A. Acquisition of finger-knuckle images

The knuckle image acquisition task was performed by means of a special device. The device consists of a box equipped with a digital camera and three LED lights. During the acquisition process, the camera focuses on the index finger knuckle. Using a dedicated application, the image is captured directly from the camera and used in the subsequent stages. The proposed device is presented in Fig. 2.

In order to determine the chains of points, the image is skeletonized, and then the end points and bifurcation points of the furrows are marked on the image. The location of such points is determined by imposing a 3×3 mask on each image pixel. For each black (skeleton) pixel, the value of J is calculated by the following equation:

$$J(x, y) = \sum_{a=-1}^{1} \sum_{b=-1}^{1} I(x+a,\, y+b), \qquad x \in \{2, \ldots, W-1\},\; y \in \{2, \ldots, H-1\}, \tag{1}$$

where $I$ is the binary skeleton image and $W$, $H$ are its width and height. Each such pixel $p(x, y)$ is then classified as an end point $p_T$ or a bifurcation point $p_B$ of a furrow:

$$p(x, y) = \begin{cases} p_T(x, y), & \text{if } J(x, y) = 2, \\ p_B(x, y), & \text{if } J(x, y) > 3. \end{cases} \tag{2}$$
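As an illustration of this classification step, the short Python/NumPy sketch below applies equations (1) and (2) to a thinned binary image. It reflects only our reading of the text (skeleton pixels are assumed to carry the value 1 and the 3×3 sum includes the centre pixel); it is not the authors' implementation, and the function name is illustrative.

```python
import numpy as np

def classify_skeleton_points(skeleton):
    """Classify skeleton pixels as furrow end points or bifurcation points.

    `skeleton` is a 2-D array with 1 for furrow (skeleton) pixels and 0 for
    background, as produced by the skeletonization step. Returns two lists of
    (x, y) coordinates: end points and bifurcation points."""
    I = (np.asarray(skeleton) > 0).astype(np.uint8)
    H, W = I.shape
    end_points, bifurcations = [], []
    # Border pixels are skipped, matching x in {2,...,W-1}, y in {2,...,H-1}.
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            if I[y, x] == 0:        # only furrow pixels are classified
                continue
            # Eq. (1): sum over the 3x3 neighbourhood, centre pixel included.
            J = int(I[y - 1:y + 2, x - 1:x + 2].sum())
            # Eq. (2): J = 2 marks an end point, J > 3 a bifurcation point.
            if J == 2:
                end_points.append((x, y))
            elif J > 3:
                bifurcations.append((x, y))
    return end_points, bifurcations
```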
C. Verification stage

In order to compare two knuckle images by means of the LSCA method, we need to create two sets $K_A = \{c_1^A, c_2^A, \ldots, c_m^A\}$ and $K_B = \{c_1^B, c_2^B, \ldots, c_n^B\}$ describing the images being compared. These sets are obtained from Algorithm 1.

Before comparison, the coordinates of all points of the chains are normalized to the interval [0, 1] and centered:

$$s_i \leftarrow s_i - \frac{\sum_{j=1}^{n} s_j}{n}, \qquad s_i \in S,\; i = 1, \ldots, n, \tag{3}$$

$$q_i \leftarrow q_i - \frac{\sum_{j=1}^{n} q_j}{n}, \qquad q_i \in Q,\; i = 1, \ldots, n. \tag{4}$$

In the least-squares contour alignment method, the comparison between two chains of points is given by the following formula [18]:

$$d(S, Q) = \min_{t \in \mathbb{R}^2,\; \theta \in [-\pi, \pi),\; k > 0} \left\| S - a_{t, \theta, k}(Q) \right\|_F, \tag{5}$$

where $a_{t, \theta, k}$ is the operator of translation by the vector $t \in \mathbb{R}^2$, rotation by the angle $\theta \in [-\pi, \pi)$, and scaling by the factor $k > 0$, and $\|M\|_F$ denotes the Frobenius norm of the matrix $M$.

The LSCA method requires that the data being compared have the same number of elements. In the case of knuckle images, this condition is not always fulfilled, because each knuckle furrow can have a different length (number of points). In our work we used a scaling method called Fixed Number of Points (FNP) to equalize the lengths of the furrows being compared. This method is described in detail in [13, 16]. A detailed description of the least-squares method is presented in [10].

Equation (5) allows the comparison of two chains of points. In order to compare all furrows in the two sets $K_A$ and $K_B$, we use the following equation:

$$d'(K_A, K_B) = \frac{1}{m} \sum_{i=1}^{m} \min\left\{ d(c_i^A, c_1^B), \ldots, d(c_i^A, c_n^B) \right\}, \tag{6}$$

where $K_A = \{c_1^A, c_2^A, \ldots, c_m^A\}$ and $K_B = \{c_1^B, c_2^B, \ldots, c_n^B\}$.

It should be noted that the coefficient (6) is not symmetrical; therefore, the final equation for comparing two knuckle images is as follows:

$$d''(K_A, K_B) = \min\left( d'(K_A, K_B),\; d'(K_B, K_A) \right). \tag{7}$$

In the verification phase, an unknown person claims an identity and provides a knuckle image $K_v$ to be verified. The knuckle image $K_v$ is compared with all $N$ knuckle images stored in the database for the person who claimed the identity. As a result of these comparisons, we obtain the set $D_A$. The general principle of construction of this set is shown below:

$$D_A = \{ d''(K_v, K_1^A),\; d''(K_v, K_2^A),\; \ldots,\; d''(K_v, K_N^A) \}, \tag{8}$$

where $K_i^A$ is the i-th knuckle image belonging to the person being verified, and $N$ is the number of all knuckle images in the database of the person being verified.

Next, the knuckle image $K_v$ is compared with knuckle images of the other users, randomly chosen from the database. The results of these comparisons are stored in the set $D_B$:

$$D_B = \{ d''(K_v, K_1^B),\; d''(K_v, K_2^B),\; \ldots,\; d''(K_v, K_N^B) \}, \tag{9}$$

where $K_i^B$ is the i-th knuckle image which belongs to any person from the database other than the person being verified.

After creating the sets $D_A$ and $D_B$, the mean values of these sets are calculated as follows:

$$\bar{d}_A = \mathrm{mean}\{D_A\}, \qquad \bar{d}_B = \mathrm{mean}\{D_B\}. \tag{10}$$

The ultimate decision of user verification depends on the values $\bar{d}_A$ and $\bar{d}_B$:

$$\text{decision} = \begin{cases} \text{genuine knuckle}, & \text{if } \bar{d}_A < \bar{d}_B, \\ \text{forged knuckle}, & \text{otherwise.} \end{cases} \tag{11}$$

III. EXPERIMENTS

The database used in the experiments contained 150 finger knuckle images, collected from 30 people, with 5 images per person.

The effectiveness of our method was evaluated by detecting forgeries during the verification stage. An image of a person being verified was compared with his/her images stored in the database, and also with randomly selected images of other people. By calculating the accuracy, we measured the effectiveness of our method.

During the testing stage, the influence of the two following parameters on the method was examined:
- parameter N: the number of all knuckle images in the database of the person being verified, see eq. (8) and (9);
- parameter FNP: the length (number of points) of the furrows being compared.

The results of the tests carried out for different values of the parameters FNP and N are presented in Table 1.
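For illustration, the following Python/NumPy sketch puts the comparison and decision stage described above into executable form. It is only a minimal reading of equations (3)-(11), not the implementation used in the experiments: the FNP resampling by linear interpolation is our simplification of the method from [13, 16], the alignment in eq. (5) is solved in closed form through a complex-number representation in the spirit of [10] (which yields a scale k ≥ 0 rather than strictly k > 0), the normalization of coordinates to [0, 1] is omitted, and all function names are our own.

```python
import numpy as np

def resample_fnp(chain, fnp):
    """Fixed Number of Points (FNP): resample a chain of 2-D points to `fnp`
    points by linear interpolation along its arc length (a simplification of
    the scaling method referred to in [13, 16])."""
    pts = np.asarray(chain, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    arc = np.concatenate(([0.0], np.cumsum(seg)))
    if arc[-1] == 0.0:                      # degenerate chain (single location)
        return np.repeat(pts[:1], fnp, axis=0)
    targets = np.linspace(0.0, arc[-1], fnp)
    return np.column_stack([np.interp(targets, arc, pts[:, 0]),
                            np.interp(targets, arc, pts[:, 1])])

def lsca_distance(S, Q):
    """Least-squares contour alignment distance, eq. (5).

    Both chains (of equal length) are centered as in eqs. (3)-(4); the optimal
    rotation and scale are then obtained in closed form by treating the points
    as complex numbers, following the approach of [10]."""
    s = S[:, 0] + 1j * S[:, 1]
    q = Q[:, 0] + 1j * Q[:, 1]
    s -= s.mean()                           # centering removes the translation t
    q -= q.mean()
    denom = np.vdot(q, q).real
    a = np.vdot(q, s) / denom if denom > 0 else 0.0   # a = k * exp(1j * theta)
    return float(np.linalg.norm(s - a * q)) # Frobenius norm of the residual

def set_distance(KA, KB, fnp=25):
    """Asymmetric furrow-set distance d', eq. (6)."""
    KA_r = [resample_fnp(c, fnp) for c in KA]
    KB_r = [resample_fnp(c, fnp) for c in KB]
    return float(np.mean([min(lsca_distance(cA, cB) for cB in KB_r)
                          for cA in KA_r]))

def knuckle_distance(KA, KB, fnp=25):
    """Symmetric knuckle-image distance d'', eq. (7)."""
    return min(set_distance(KA, KB, fnp), set_distance(KB, KA, fnp))

def verify(Kv, claimed_images, other_images, fnp=25):
    """Decision rule, eqs. (8)-(11): the claim is accepted when Kv is, on
    average, closer to the claimed person's images than to images of others."""
    d_A = np.mean([knuckle_distance(Kv, K, fnp) for K in claimed_images])
    d_B = np.mean([knuckle_distance(Kv, K, fnp) for K in other_images])
    return "genuine knuckle" if d_A < d_B else "forged knuckle"
```

Here `Kv`, as well as every image in `claimed_images` and `other_images`, is represented as a list of furrow point chains; with `fnp=25` and two images of the claimed person this would correspond to the best-performing configuration (N = 2, FNP = 25) reported in Table 1.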
TABLE I. RESULTS OF INVESTIGATIONS (verification accuracy [%] ± standard deviation).

N    | FNP = 25     | FNP = 50     | FNP = 75     | FNP = shorter chain | FNP = longer chain | Mean
-----|--------------|--------------|--------------|---------------------|--------------------|-------------
1    | 91.45 ± 0.50 | 91.75 ± 0.33 | 90.68 ± 0.77 | 89.26 ± 0.36        | 89.80 ± 0.92       | 90.59 ± 0.58
2    | 92.77 ± 0.33 | 91.52 ± 0.25 | 91.59 ± 0.69 | 89.16 ± 0.40        | 89.55 ± 0.38       | 90.80 ± 0.41
3    | 92.37 ± 0.14 | 91.71 ± 0.19 | 91.65 ± 0.38 | 88.95 ± 0.58        | 89.38 ± 0.14       | 90.81 ± 0.29
4    | 91.70 ± 0.30 | 91.48 ± 0.32 | 91.82 ± 0.25 | 88.92 ± 0.24        | 89.41 ± 0.25       | 90.67 ± 0.27
5    | 92.23 ± 0.07 | 91.94 ± 0.13 | 92.04 ± 0.19 | 89.25 ± 0.08        | 88.76 ± 0.49       | 90.84 ± 0.19
6    | 91.81 ± 0.18 | 91.75 ± 0.07 | 91.99 ± 0.07 | 89.38 ± 0.12        | 89.26 ± 0.23       | 90.84 ± 0.13
7    | 91.72 ± 0.15 | 91.95 ± 0.06 | 91.99 ± 0.20 | 89.41 ± 0.17        | 89.15 ± 0.11       | 90.85 ± 0.14
8    | 92.19 ± 0.06 | 91.94 ± 0.10 | 91.84 ± 0.08 | 89.19 ± 0.05        | 88.72 ± 0.15       | 90.8 ± 0.09
9    | 91.98 ± 0.19 | 91.87 ± 0.10 | 91.99 ± 0.04 | 89.30 ± 0.08        | 88.99 ± 0.21       | 90.2 ± 0.13
10   | 91.95 ± 0.06 | 91.93 ± 0.09 | 91.89 ± 0.06 | 89.39 ± 0.03        | 89.23 ± 0.14       | 90.8 ± 0.08
Mean | 91.96 ± 0.20 | 91.78 ± 0.16 | 91.75 ± 0.27 | 89.22 ± 0.21        | 89.22 ± 0.30       |
By analyzing the results in Table 1, we can see that the value of N does not significantly affect the verification accuracy. Nevertheless, this parameter influences the stability of the results, which is confirmed by the values of the standard deviations: a greater value of N gives more stable results. After analyzing the FNP parameter, we noticed that decreasing the FNP value increases the level of accuracy. Consequently, the best accuracy is equal to 92.77% and was obtained for N = 2 and FNP = 25.

IV. CONCLUSIONS

In this paper we described a new method of person verification based on finger knuckle image recognition. We obtained high verification effectiveness using the least-squares contour alignment method. The experiments show that the finger knuckle image-based method is a promising solution in the biometrics area. The method can also be used in multi-biometric solutions. This approach will be analyzed in our further investigations.

Additionally, future research will include testing other databases and more advanced classification methods. Further study will also concentrate on the analysis of other image pre-processing methods, which will allow the furrows to be extracted more accurately from the finger knuckle images.

REFERENCES

[1] P. Campisi (Ed.), "Security and Privacy in Biometrics", Springer, 2013.
[2] Ng. Choon-Ching, Y. Moi Hoon, N. Costen, B. Li, "Automatic Wrinkle Detection Using Hybrid Hessian Filter", Lecture Notes in Computer Science vol. 9005, 2015, pp. 609–622.
[3] R. Doroz, et al., "A New Personal Verification Technique Using Finger Knuckle Imaging", Lecture Notes in Computer Science vol. 9876, 2016, pp. 515–524.
[4] M.A. Ferrer, C.M. Travieso, J.B. Alonso, "Using hand knuckle texture for biometric identifications", IEEE Aerospace and Electronic Systems Magazine vol. 21(6), 2006, pp. 23–27.
[5] Y. Iwahori, A. Hattori, Y. Adachi, et al., "Automatic detection of polyp using Hessian Filter and HOG features", Procedia Computer Science vol. 60(1), 2015, pp. 730–739.
[6] A. Jain, P. Flynn, A. A. Ross (Eds.), "Handbook of Biometrics", Springer, 2008.
[7] A. Kumar, C. Ravikanth, "Personal authentication using finger knuckle surface", IEEE Transactions on Information Forensics and Security vol. 4(1), 2009, pp. 98–110.
[8] A. Kumar, B. Wang, "Recovering and matching minutiae patterns from finger knuckle images", Pattern Recognition Letters vol. 68, 2015, pp. 361–367.
[9] A. Kumar, Y. Zhou, "Human identification using knuckle codes", IEEE 3rd International Conference on Biometrics: Theory, Applications, and Systems, 2009, pp. 98–109.
[10] I. Markovsky, S. Mahmoodi, "Least-squares contour alignment", IEEE Signal Processing Letters vol. 16(1), 2009, pp. 41–44.
[11] A. Morales, C.M. Travieso, M.A. Ferrer, et al., "Improved finger-knuckle-print authentication based on orientation enhancement", Electronics Letters vol. 47(6), 2011, pp. 380–382.
[12] N. Otsu, "A Threshold Selection Method from Gray-Level Histograms", IEEE Transactions on Systems, Man and Cybernetics vol. 9(1), 1979, pp. 62–66.
[13] M. Palys, R. Doroz, P. Porwik, "On-line signature recognition based on an analysis of dynamic feature", IEEE Int. Conference on Biometrics and Kansei Engineering (ICBAKE 2013), Tokyo Metropolitan University Akihabara, 2013, pp. 103–107.
[14] T. Pavlidis, "A thinning algorithm for discrete binary images", Computer Graphics and Image Processing vol. 13(2), 1980, pp. 142–157.
[15] P. Porwik, R. Doroz, K. Wrobel, "A new signature similarity measure", Proceedings of World Congress on Nature and Biologically Inspired Computing, 2009, pp. 1022–1027.
[16] P. Porwik, R. Doroz, "Self-adaptive biometric classifier working on the reduced dataset", Lecture Notes in Computer Science vol. 8480, 2014, pp. 377–388.
[17] T. E. Wesołowski, R. Doroz, K. Wrobel, H. Safaverdi, "Keystroke Dynamics and Finger Knuckle Imaging Fusion for Continuous User Verification", Lecture Notes in Computer Science vol. 10244, 2017, pp. 141–152.
[18] K. Wrobel, R. Doroz, "The new method of signature recognition based on least squares contour alignment", Proceedings of the International IEEE Conference on Biometrics and Kansei Engineering (ICBAKE 2009), 2009, pp. 80–83.
[19] K. Wrobel, R. Doroz, P. Porwik, "Fingerprint Reference Point Detection Based on High Curvature Points", Lecture Notes in Computer Science vol. 9714, 2016, pp. 538–547.
[20] M. Xiong, W. Yang, C. Sun, "Finger-knuckle-print recognition using LGBP", Lecture Notes in Computer Science vol. 6676, 2011, pp. 270–277.