Fingerprint Image Processing For Generating Biometric Cryptographic Key
Thesis, 2008
Al Tarawneh Mokhled
To my children, Zaid, Saba, Banan and Duaa, who are the reason
to try and make the world a safer place.
Abstract
Cryptography and biometrics have been identified as two of the most important aspects
of the digital security environment. For various types of security problems, the merging
of cryptography and biometrics has led to the development of bio crypt technology.
The new technology suffers from several limitations, and this thesis addresses the
quality of biometric information and the security weaknesses of cryptography. In many
applications the fingerprint has been chosen as the core of the combined bio crypt
technology due to its maturity in terms of availability, uniqueness, permanence,
feasibility, ease of use and acceptance. The fingerprint has been studied from the point
of view of its information strength and its suitability to cryptographic requirements. The
factors relating to generating and constructing a combined bio crypt key, such as
biometric image validity, quality assessment and distinct feature extraction, are studied
to avoid corruption of the source biometric images. A number of contributions are made
in this work. Firstly, the validity and quality robustness of fingerprint images are
analysed, and a novel algorithm is realised for circumventing these limitations. Secondly,
new algorithms for increasing the management security of image based biometric keys
are described, shielding bio crypto information as another line of defence against serious
attack. Finally, a fingerprint feature vector is proposed to replace the minutiae based
fuzzy vault in order to output high entropy keys. This allows the concealment of the
original biometric data such that it is impossible to recover the biometric data even when
the information stored in the system is open to an attacker.
Acknowledgments
[Allah raises the ranks of those among you who believe and those who
were granted the knowledge.] Qur'an
I would also like to express my sincere thanks to Dr. W.L. Woo for his
supervision, courtesy and kindness. His helpful comments, constructive criticism and
invaluable suggestions made this work successful.
Thanks also go to my friend Lloyd Palmer for giving his valuable time to
proofread an earlier draft of this thesis.
Mokhled S. AlTarawneh
List of Publications
[1] M. S. Al-Tarawneh, L. C. Khor, W. L. Woo, and S. S. Dlay, "Crypto key
generation using contour graph algorithm," in Proceedings of the 24th IASTED
International Conference on Signal Processing, Pattern Recognition, and
Applications, Innsbruck, Austria: ACTA Press, 2006, pp. 95-98.
[6] M.S. ALTARAWNEH, W.L. WOO, and S.S. DLAY, "A Hybrid Method for
Fingerprint Image Validity and Quality Computation," accepted in The 7th
WSEAS International Conference on SIGNAL PROCESSING, ROBOTICS and
AUTOMATION (ISPRA '08), Cambridge, UK, February 20-22, 2008
[7] M.S. ALTARAWNEH, W.L. WOO, and S.S. DLAY, "Biometric Key Capsulation
Technique Based on Fingerprint Vault: Analysis and Attack," accepted in the 3rd
IEEE International Conference on Information & Communication Technologies:
From Theory to Applications (ICTTA '08), Umayyad Palace, Damascus, Syria,
April 7-11, 2008.
[8] M.S. ALTARAWNEH, W.L. WOO, and S.S. DLAY, "Fuzzy Vault Crypto
Biometric Key Based on Fingerprint Vector Features", accepted in the 6th
Symposium on Communication Systems, Networks and Digital Signal Processing,
Graz University of Technology, Graz.
Abbreviations
AFAS Automatic Fingerprint Authentication System
AFIS Automatic Fingerprint Identification System
BE Biometric Encryption
BKC Biometric Key Capsulation
BS Background Subtract
BW Berlekamp Welch
CA Certificate Authority
CBCG Contour Based Construction Graph
CN Crossing Number
CP Chaff Point
CRC Cyclic Redundancy Check
CSF Contrast Sensitivity Functions
DB Data Base
DC Directional Contrast
DFT Discrete Fourier Transform
DT Determine Threshold
EER Equal Error Rate
EK Encapsulated Key
EPK Encryption Provider Key
FAR False Acceptance Rate
FE Fuzzy Extractor
FFV Fingerprint Fuzzy Vault
FMR False Matching Rate
FNMR False Non-Match Rate
FP Fingerprint
FR False Rate
FR Full Reference
FRR False Reject Rate
FVC Fingerprint Verification Competition
FVS Fuzzy Vault Scheme
GAR Genuine Accept Rate
GF Gabor Feature
GF Galois Field
GS Gabor Spectrum
GSM Gabor Spectral Method
HLK Header Locker Key
HVS Human Visual System
ICP Iterative Closest Point
IQA Image Quality Assessment
IQF Image Quality of Fingerprint
IQS Image Quality Survey
IT Information Technology
ITU International Telecommunication Union
MINDTCT Minutiae Detection
MOS Mean Opinion Score
MLP Multi Layer Perceptron
MSE Mean Squared Error
MR Matrices Regenerator
NFIQ NIST Fingerprint Image Quality
NIST National Institute of Standards and Technology
NN Neural Network
NR No Reference
OAS Object Area Segmentation
OCL Orientation Certainty Level
OF Orientation Field
PCA Principal Component Analysis
PET Privacy Enhancing Technology
PKC Public Key Cryptography
PKI Public Key Infrastructure
PLCC Pearson Linear Correlation Coefficient
PPI Pixel Per Inch
PR Polynomial Reconstruction
PS Power Spectrum
PSNR Peak Signal-to-noise Ratio
PWC Pixels Weight Calculation
QA Quality Assessment
QI Quality Index
ROC Receiver Operating Characteristic
ROI Region Of Interest
RP Reference Point
RR Reduced Reference
RS Reed-Solomon
SKC Secret Key Cryptography
SP Singular Point
SROCC Spearman Rank Order Correlation Coefficient
SSIM Structural Similarity
SWA Slicing Window Algorithm
SW Slicing Window
TAR True Acceptance Rate
TM Transformed Matrix
TR True Rate
TRR Threshold Ratio
w.r.t With respect to
VCA Validity Check Algorithm
VHG Vector Header Generator
VS Vault Set
WSQ Wavelet Scalar Quantization
Table of Contents:
Chapter 1 Introduction.................................................................................... 1
1.1 Background ......................................................................................................... 1
1.2 Biometric Systems .............................................................................................. 3
1.3 Cryptography ...................................................................................................... 5
1.4 Biometric and Cryptography Merging................................................................ 7
1.5 Aims and Objectives ........................................................................................... 9
1.6 Original Contributions ........................................................................................ 9
1.7 Thesis Outline ................................................................................................... 11
3.6.1 Subjective Test.......................................................................................... 60
3.6.2 NIST Fingerprint Image Quality Test....................................................... 61
3.6.3 VCA Test .................................................................................................. 61
3.7 Summary ........................................................................................................... 63
5.3.1 Contour graph analysis ........................................................................... 112
5.4 Slicing Window Algorithm............................................................................. 114
5.4.1 Slicing window analysis ......................................................................... 119
5.5 Summary ......................................................................................................... 121
7.2 Future Work .................................................................................................... 157
List of Figures
Figure 1-1 Block diagram of a generic biometric system [7] ............................................. 4
Figure 1-3 Cryptography types: a) secret-key, b) public key, and c) hash function........... 6
Figure 2-1 Extraction of a local region and transformation to vertical aligned pattern.... 15
Figure 2-4 Foreground/background segmentation: (a) origin image; (b) quality field
(Standard deviation of m Gabor features); (c) segmented image ..................................... 17
Figure 2-10 Fingerprint minutiae features (x, y, θ ) extracted using the Truth tool CUBS,
developed at centre for Unified Biometrics and Sensors, University at Buffalo.............. 38
Figure 2-11 Fuzzy fingerprint vault : (a) vault encoding, (b) vault decoding [65]........... 39
Figure 3-2. (a)Fingerprint image, (b) histogram of fingerprint image, (c) region of interest,
(d) ROI of a fingerprint image .......................................................................................... 49
Figure 3-5. Sample images, with different validity and quality ....................................... 56
Figure 3-7 (a) Objects segmented areas, (b-b') object weighted areas ............................. 60
Figure 3-8 Approaches scattering relation....................................................................... 62
Figure 4-1 (a) Fingerprint image capturing position and placement , (b) Orientation field
........................................................................................................................................... 65
Figure 4-3 Very few minutiae for images from FVC2002 .............................................. 66
Figure 4-5 Diagram of a full reference image quality assessment system ....................... 75
Figure 4-6 Block diagram of conventional reduced reference image quality methods. .. 76
Figure 4-14: Scatter plot of PS vs. MOS with Pearson correlation: 0.7822 ..................... 91
Figure 4-15: Scatter plot of DC vs. MOS with Pearson correlation: 0.7641 .................... 91
Figure 4-16: Scatter plot of GF vs. MOS with Pearson correlation: 0.8231 .................... 92
Figure 4-17: Scatter plot of NN vs. MOS with Pearson correlation: 0.8009.................... 92
Figure 4-18: Scatter plot of GSM vs. MOS with Pearson correlation: 0.8811................. 93
Figure 4-19: False rate (FR) versus True rate TR of image quality assessment approaches
........................................................................................................................................... 94
Figure 5-2 Block diagram for minutiae based feature extraction ..................................... 99
Figure 5-3: (a) Ridge ending CN=1, (b) Bifurcation CN=3 and (c) The eight connected
neighbourhood of the pixel P in the 3x3 projected window. .......................................... 100
Figure 5-5 Original fingerprint image with its result of orientation field computation.. 106
Figure 5-8 Contour Based Construction Graph algorithm block diagram...................... 109
Figure 5-11 Adjacency Matrix for the given graph in Figure (5-9)................................ 111
Figure 5-17 Generated keys, where HLK is Header Locker Key, EPK is Encryption
Provider Key. .................................................................................................................. 119
Figure 6-1 Fingerprint minutiae fuzzy vault message encryption. ................................. 124
Figure 6-2 Fingerprint minutiae fuzzy vault message decryption. ................................. 124
Figure 6-7 Fingerprint Vault Decryption implementation model (dashed box) ............ 136
Figure 6-8 Effect of points parameter (a) true points, (b) chaff points......................... 138
Figure 6-9 Effect of threshold parameter........................................................................ 139
Figure 6-14 The relationship between chaff points, minimum distance and releasability of locked key ................................................. 146
Figure 6-15 The relationship between chaff points, Polynomial degree, vault complexity
......................................................................................................................................... 147
Figure 6-17 The attack complexity varies according to the degree of polynomial ........ 152
Figure 6-18 The relationship between the key releasability and the minimum distance.153
List of Tables
Table 1-1 Comparison of various biometric technologies, according to A. Jain [2], U.
Uludag [5], the perception based on (High=100, Medium=75, Low=50) .......................... 3
Table 4-1 Part of "MOS-IQS, PS, DC, GF and NN- NFIQ quality results",.................... 89
Table 5-4 Uniqueness of generated keys where logical 1 (true) value indicates full
matching and logical 0 (false) otherwise. ....................................................................... 121
Chapter 1 Introduction
1.1 Background
Technology brings a new dimension to biometrics in this information society era, while
biometrics brings a new dimension to individual identity verification, providing a
guaranteed level of accuracy and consistency beyond traditional methods. Biometrics means
"the statistical analysis of biological observations and phenomena". It refers to the use of
distinctive physical (e.g., fingerprint, face, retina, iris, hand geometry, palm) and
behavioural (e.g., gait, signature, speech) characteristics for automatically recognizing
individuals [1, 2]. Biometric based identification relies on "something that you are" or
"something that you do", and hence it differentiates between an authorized person and an
impostor [3]. Any physiological or behavioural human characteristic can be used as a
biometric as long as it satisfies the following requirements [4]:
• Acceptability: the extent to which people are willing to accept the biometric
system.
Biometric characteristics provide a unique natural signature of a person and are widely
accepted. While some of the requirements described above, like universality and
collectability, are relatively easy to verify for certain human characteristics, others, like
immutability and uniqueness, require extensive tests on a large number of samples in
order to be verified. Each biometric technique has its advantages and disadvantages, and
the applicability of a specific biometric technique depends heavily on the application domain.
No single biometric can meet all the requirements (e.g. accuracy, cost, practicality),
which means no biometric is "optimal" [5]. Fingerprints have been used as a biometric
characteristic because they offer unique advantages over other biometrics in terms
of acquisition ease, relative temporal invariance, and uniqueness among different subjects
[6]. A brief comparison of biometric techniques based on seven factors is provided in
Table 1-1. In this sense, each biometric technique is admissible. For example, it is well
known that both the fingerprint technique and the iris scan technique perform much better
than the voice print technique in terms of accuracy and speed. As can be seen from Table
1-1, overall, fingerprints perform better than other biometric techniques. The fingerprint has
its own distinctiveness and has been used for personal identification for many years.
Fingerprint identification is based on two basic premises: 1. Persistence: the basic
characteristics of fingerprints do not change with time; and 2. Individuality: everybody has a
unique fingerprint. Biometrics can operate in one of two modes: the identification mode,
in which the identity of an unknown user is determined, and the verification mode, in
which a claimed identity is either accepted or rejected. On this basis, biometrics have been
applied in many high-end applications, with governments, defence and airport security
being major customers. However, biometric applications are also moving towards
commercial arenas, namely network/PC login security, web page security, employee
recognition, time and attendance systems, and voting solutions.
While biometric systems have their limitations, they have an edge over traditional security
methods in that they cannot be easily stolen or shared. Besides bolstering security,
biometric systems also enhance user convenience by alleviating the need to design and
remember passwords.
[Table 1-1, column headings: Universality, Uniqueness, Permanence, Collectability, Performance, Acceptability, Circumvention and Average, scored for each biometric.]
Data Collection: This subsystem uses a sensor or camera to acquire the image of the
biometric trait of the user.
Transmission: This subsystem compresses the data collected by the data collection module
and transmits it to the data storage and signal processing modules.
Signal Processing: This is the most important module of the system. It performs feature
extraction by image processing techniques and pattern matching operations.
Decision: This module performs identification or verification by using the match scores.
This thesis is concerned with the important issues of data collection, storage, and data
processing for merging biometrics and cryptography to bind and generate the bio crypt.
Figure (1-1) shows that the first stage of any biometric system is the acquisition module,
which acquires the biometric data from the user. To this module, this work adds an
automated validity checker and quality assessor to enhance the system performance.
due to poor system performance. If the performance is poor the security will be
compromised, and there may be excessive dependence on the fallback system.
1.3 Cryptography
Cryptography is the practice and study of hiding information. Cryptography refers almost
exclusively to encryption, the process of converting ordinary information, i.e. plaintext,
into unintelligible data, i.e. ciphertext [8]. Decryption is the reverse, moving from
unintelligible ciphertext to plaintext, Figure (1-2). A cipher is a pair of algorithms which
perform this encryption and decryption. The detailed operation of a cipher is
controlled both by the algorithm and, in each instance, by a key: a secret
parameter (ideally, known only to the communicants) for a specific message exchange
context. Keys are important, as ciphers without variable keys are easily breakable and
therefore less than useful for most purposes. Historically, ciphers were often used directly
for encryption or decryption, without additional procedures such as authentication or
integrity checks.
a) Secret Key (symmetric) cryptography. SKC uses a single key for both encryption
and decryption.
b) Public key (asymmetric) cryptography. PKC uses two keys, one for encryption and
the other for decryption.
c) Hash function (one-way cryptography). Hash functions have no key since the
plaintext is not recoverable from the ciphertext.
Figure 1-3 Cryptography types: a) secret-key, b) public key, and c) hash function.
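To make the distinction concrete, the following is a minimal, hedged sketch (not part of the thesis) of the three cryptography types in Figure (1-3), using Python's standard hashlib and the third-party cryptography package; the message and key sizes are arbitrary assumptions.

```python
# Illustrative sketch of the three cryptography types: a) secret-key, b) public-key, c) hash.
import hashlib
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

plaintext = b"fingerprint template"

# a) Secret-key (symmetric): one key both encrypts and decrypts.
sk = Fernet.generate_key()
ciphertext = Fernet(sk).encrypt(plaintext)
assert Fernet(sk).decrypt(ciphertext) == plaintext

# b) Public-key (asymmetric): encrypt with the public key, decrypt with the private key.
priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ct = priv.public_key().encrypt(plaintext, oaep)
assert priv.decrypt(ct, oaep) == plaintext

# c) Hash (one-way): no key; the plaintext is not recoverable from the digest.
digest = hashlib.sha256(plaintext).hexdigest()
```

In all three cases the security rests on the key (or, for the hash, on the one-wayness of the function) rather than on the secrecy of the algorithm, which is the point made below.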
Cryptography is a particularly interesting field because of the amount of work that is, by
necessity, done in secret. The irony is that today, secrecy is not the key to the goodness of
a cryptographic algorithm. Regardless of the mathematical theory behind an algorithm,
the best algorithms are those that are well-known and well-documented because they are
also well-tested and well-studied. In fact, time is the only true test of good cryptography;
any cryptographic scheme that stays in use year after year is most likely a good one [10].
The strength of cryptography lies in the choice (and management) of the keys; longer
keys will resist attack better than shorter keys.
Poor quality of biometric source images may affect the system performance.
1.5 Aims and Objectives
The aim of performing scientific research into imaging and security fields is to create
acceptance for, and quality of, fingerprint based authentication methods, with the
intention of meeting the trust and security requirements in information technology (IT)
transmission. The aims and objectives of this research can be summarized as follows:
2. A new algorithm for fingerprint image quality assessment which enhances the
overall performance of fingerprint based systems. The developed algorithm is
derived from the power spectrum of two-dimensional Gabor features, and it benefits
from the use of both Gabor and Fourier power spectrum methods, such as their
frequency and orientation representations. The developed algorithm can effectively
guide template selection at the enrolment stage and fingerprint image quality
classification for automatic parameter selection in fingerprint image pre-
processing. This work has been presented in [19], [20].
4. A new capsulation technique for fingerprint fuzzy vault key management. The
developed technique is used to secure both the secret key and the biometric
template by binding and shielding them within a cryptographic framework. The
technique uses a capsulation process to solve the problems of key management and
to distribute the level of security over the shield structure. The key entropy depends
on the level of shielding and on the polynomial degree of the shielded secret key, while
the encryption key depends on the constructed vault entropy; the technique is also
slightly more efficient in terms of encryption/decryption speed because it uses a
heading capsulation technique to cover the ciphertexts.
1.7 Thesis Outline
Chapter 2 surveys the development stages of the bio crypt technique, from validity checking
and quality assessment to the quality of service of key construction. Chapter 3 provides a
detailed methodology of how to build the intended validity check approach for fingerprint
image benchmarking. This thesis has conducted experiments on the FVC2000 DB1_B and
TIMA databases [21] & [22]. It also briefly reviews the segmentation method as a
fundamental infrastructure for the validity check approach, and the characteristics of this
approach are highlighted. Chapter 4 describes image quality measures and methods, and the
proposed Gabor spectrum approach for fingerprint image quality assessment. The
proposed algorithm is tested subjectively, objectively and for reliability; results are fully
discussed and a detailed summary is given in this chapter. Chapter 5 describes fingerprint
bio key methods "releasing, generating and binding"; in this chapter minutiae based
generation approaches are proposed and investigated to address the problems of direct
key generation. Chapter 6 analyses a binding technique based on the fuzzy vault construct.
It shows the technique's weaknesses and discusses how to overcome these problems. A key
capsulation technique is also proposed to solve the key management problems. Chapter 7
concludes the thesis by summarizing the results obtained and indicating future
developments.
Chapter 2 Literature Review
2.1 Introduction
For biometric applications and systems to be accurate, a biometric template must be
generated using a desirable bio pattern sample and a qualified image source. Biometric
image quality assessment and validity analysis are defined as predictors of the accuracy
and performance of a biometric security system. Therefore, it is important to determine the
validity and quality of the input image during the enrolment stage, avoiding a mismatch
result later in the process. It is desirable to estimate the image quality of the fingerprint
image before it is processed for feature extraction. This helps in deciding on the type of
image enhancements that are needed and on the threshold levels for the matcher
performance; e.g. a sample's quality score reflects the predicted positive or negative
contribution of an individual sample to the overall performance of a fingerprint matching
system. Investigations of fingerprint image validity analysis and quality estimation are
important techniques for crypto key system construction and judgement. Image validity
and quality are critical aspects in the crypto security environment, where entire processes
are built around a captured fingerprint image, as well as in other authentication and
identification systems. This literature survey presents the fields of fingerprint validity,
quality assessment and crypt construction, and gives a general description of the
various considerations in the development and implementation of fingerprint image
validity, quality assessment and fingerprint crypto based systems.
$$C = E\left\{\begin{bmatrix} dx \\ dy \end{bmatrix}\begin{bmatrix} dx & dy \end{bmatrix}\right\} = \begin{bmatrix} a & c \\ c & b \end{bmatrix} \qquad \text{(2-1)}$$

where $E\{\cdot\} \equiv \frac{1}{N}\sum(\cdot)$.

$$\lambda_{\max} = \frac{(a+b) + \sqrt{(a-b)^{2} + 4c^{2}}}{2} \qquad \text{(2-2)}$$

$$\lambda_{\min} = \frac{(a+b) - \sqrt{(a-b)^{2} + 4c^{2}}}{2} \qquad \text{(2-3)}$$
For a fingerprint image block, the ratio between λmin and λmax gives an orientation
certainty level (OCL), Equation (2-4). The OCL gives an indication of how strongly the energy is
concentrated along the ridge-valley orientation: the lower the value, the stronger the
concentration. It is obvious that the OCL lies between 0 and 1, as a, b > 0. It is used to estimate
the orientation field and localize the region of interest (ROI) within the input fingerprint
image.
$$ocl = \frac{\lambda_{\min}}{\lambda_{\max}} = \frac{(a+b) - \sqrt{(a-b)^{2} + 4c^{2}}}{(a+b) + \sqrt{(a-b)^{2} + 4c^{2}}} \qquad \text{(2-4)}$$
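For illustration only (not the thesis implementation), a small numpy sketch of Equations (2-1)-(2-4); the block size and the gradient operator used here are assumptions.

```python
# Hedged sketch: orientation certainty level (OCL) of one image block from pixel gradients.
import numpy as np

def block_ocl(block: np.ndarray) -> float:
    """block: 2-D grayscale array (e.g. 32x32). Returns OCL in [0, 1]."""
    gy, gx = np.gradient(block.astype(float))      # pixel gradients dy, dx
    a = np.mean(gx * gx)                            # E{dx^2}
    b = np.mean(gy * gy)                            # E{dy^2}
    c = np.mean(gx * gy)                            # E{dx dy}
    root = np.sqrt((a - b) ** 2 + 4 * c ** 2)
    lam_max = (a + b + root) / 2.0
    lam_min = (a + b - root) / 2.0
    return lam_min / lam_max if lam_max > 0 else 1.0  # low value = strongly oriented block
```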
The certainty level of the orientation field in a block quantifies the extent to which the
pixel gradient orientations agree with the block gradient orientation. For each block, if its
certainty level of the orientation field is below a threshold, then all the pixels in this block
are marked as background pixels. As the computation of the certainty level is a by-product of
the local ridge orientation estimation, it is a computationally efficient segmentation
approach. Performing the principal component analysis (PCA) approach can effectively
indicate the directional strength possessed by an image sub-block; however, it does not
guarantee any periodic layout of ridges and valleys. Although the OCL is a good indicator of
the quality of a fingerprint sample, it is still not sufficient. Therefore, there is a need to further
examine the ridge-valley structure of the fingerprint sample. Ridge-valley structure analysis
is performed on image blocks. Inside each block, an orientation line, which is perpendicular
to the ridge direction, is computed. At the centre of the block, along the ridge direction, a
2-D vector V1 (the slanted square in the fingerprint orientation pattern, Figure (2-1)), with a
size of 32 × 13 pixels, is extracted and transformed to a vertically aligned 2-D vector V2. Using
Equation (2-5), a 1-D vector V3, which is the average profile of V2, can be calculated.
Figure 2-1 Extraction of a local region and transformation to vertical aligned pattern
$$V_3(i) = \frac{1}{m}\sum_{j=1}^{m} V_2(i,j), \qquad i = 1, \ldots, 32 \qquad \text{(2-5)}$$

where m is the block height (13 pixels) and i is the horizontal index.
Once V3 has been calculated, linear regression is applied to V3 to find the
Determine Threshold (DT1), which is a local threshold for the block. DT1 is the line
positioned at the centre of the vector V3, and is used to segment the image block into
ridge or valley regions. Regions with grey level intensity lower than DT1 are classified as
ridges; otherwise they are classified as valleys. The process of segmenting the fingerprint
region into ridge and valley using DT1 is shown in Figure (2-2). From the one-
dimensional signal in Figure (2-2), several useful parameters are computed, such as
valley thickness and ridge thickness. Since good finger images cannot have ridges that
are too close together or too far apart, the nominal ridge and valley thicknesses can be used as a
measure of the quality of the captured finger image. Similarly, ridges that are
unreasonably thick or thin indicate that the finger image may not have been captured properly or
is a residual sample.
Thus, the finger image quality can be determined by comparing the ridge and valley
thicknesses to their nominal ranges of values. Any value outside the nominal range
may imply a bad quality ridge pattern. Figure (2-3) shows the grey level distribution of
the segmented ridge and valley. The overlapping area is the region of potential
misclassification since, in this region, whether a pixel belongs to ridge or valley cannot be
accurately determined using DT1. Hence, the area of the overlapping region can be an
indicator of the clarity of ridge and valley, subject to the ridge and valley thicknesses
being within the acceptable range.
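The per-block profile, local threshold and thicknesses can be sketched in a few lines; the following is a hedged illustration of Equation (2-5) and the DT1 segmentation, not the thesis code, and the nominal thickness range quoted in the closing comment is an assumption.

```python
# Hedged sketch: average ridge profile V3, local threshold DT1 and ridge/valley thicknesses.
import numpy as np

def ridge_valley_stats(v2: np.ndarray):
    """v2: 32 x 13 block aligned so ridges run vertically (Figure 2-1)."""
    v3 = v2.mean(axis=1)                      # Eq. (2-5): 1-D average profile, length 32
    x = np.arange(v3.size)
    slope, intercept = np.polyfit(x, v3, 1)   # linear regression over V3 -> DT1 line
    dt1 = slope * x + intercept
    is_ridge = v3 < dt1                       # intensity below DT1 -> ridge, else valley
    # consecutive run lengths give ridge / valley thicknesses (in samples)
    ridge_t, valley_t, run = [], [], 1
    for a, b in zip(is_ridge[:-1], is_ridge[1:]):
        if a == b:
            run += 1
        else:
            (ridge_t if a else valley_t).append(run)
            run = 1
    (ridge_t if is_ridge[-1] else valley_t).append(run)
    return dt1, ridge_t, valley_t

# a block would then be flagged if the thicknesses fall outside an assumed nominal
# range, e.g. 2-10 pixels at 500 ppi (illustrative values only).
```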
Shen et al. [25] divided the fingerprint image into N blocks and applied Gabor filtering to the
image sub-blocks. The Gabor features of each block are computed first, and then the standard
deviation of the Gabor features is used to determine the quality of the block. They conclude
that a good quality block, with a clear repetition of the ridge and valley pattern, can be
identified by the output of a Gabor filter bank. The mathematical formulation of this method
is as follows:
The general form of a 2D Gabor filter is defined by

$$h\left(x, y, \theta_k, f, \sigma_x, \sigma_y\right) = \exp\left[-\frac{1}{2}\left(\frac{x_{\theta_k}^{2}}{\sigma_x^{2}} + \frac{y_{\theta_k}^{2}}{\sigma_y^{2}}\right)\right] \times \exp\left(i\,2\pi f x_{\theta_k}\right), \qquad k = 1, \ldots, m \qquad \text{(2-6)}$$
where $x_{\theta_k} = x\cos\theta_k + y\sin\theta_k$ and $y_{\theta_k} = -x\sin\theta_k + y\cos\theta_k$, $f$ is the frequency of
the sinusoidal plane wave, $m$ denotes the number of orientations, $\theta_k$ is the $k$th orientation
of the Gabor filter, and $\sigma_x$ and $\sigma_y$ are the standard deviations of the Gaussian envelope along
the x and y axes, respectively. After obtaining the $m$ Gabor features $g_{\theta_k}$ of the block, their
standard deviation $G$ is computed:

$$G = \left(\frac{1}{m-1}\sum_{k=1}^{m}\left(g_{\theta_k} - \bar{g}_{\theta}\right)^{2}\right)^{1/2}, \qquad \bar{g}_{\theta} = \frac{1}{m}\sum_{k=1}^{m} g_{\theta_k} \qquad \text{(2-7)}$$

where $\theta_k = \pi(k-1)/m, \; k = 1, \ldots, m$.
They compute the value of G for each block. If G is less than a block threshold value
(Tb), the block is marked as a background block; otherwise the block is marked as a
foreground block. The quality field for the fingerprint image in Figure (2-4(a)) is shown
in Figure (2-4(b)). The segmented image is shown in Figure (2-4(c)).
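A minimal sketch of this Gabor-based block measure (Equations (2-6)-(2-7)) is given below; it is not Shen et al.'s exact implementation, and the filter parameters (m, f, and a single shared sigma in place of separate sigma_x and sigma_y) are assumptions.

```python
import numpy as np

def gabor_block_quality(block: np.ndarray, m: int = 8, f: float = 0.1,
                        sigma: float = 4.0) -> float:
    """Standard deviation G of m Gabor feature magnitudes for one block (Eq. 2-7)."""
    h, w = block.shape
    y, x = np.mgrid[-(h // 2):(h - h // 2), -(w // 2):(w - w // 2)]
    g = []
    for k in range(m):
        theta = np.pi * k / m                          # theta_k = pi(k-1)/m
        xt = x * np.cos(theta) + y * np.sin(theta)
        yt = -x * np.sin(theta) + y * np.cos(theta)
        kern = np.exp(-0.5 * (xt**2 + yt**2) / sigma**2) * np.exp(2j * np.pi * f * xt)
        g.append(np.abs(np.sum(block * kern)))         # Gabor feature g_theta_k
    return float(np.std(np.array(g), ddof=1))          # G: high std => clear ridge pattern
```

A block whose G falls below Tb would be treated as background, matching the rule described in the surrounding text.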
The quality field value for a foreground block is defined to have one of two values: "good"
or "poor". A block is marked as a "poor" quality block if its G value is less than a preset
quality threshold (Tq); otherwise it is marked as a "good" quality block. A fingerprint image
is marked as a "good" quality image if its QI value is bigger than a threshold TQ; otherwise
it is marked as a "poor" quality image. The choices of Tq and TQ were determined
experimentally. Shen et al. [25] and Qi et al. [26] categorized the poor
quality fingerprint images into smudged and dry images according to smudginess and
dryness indices (SI, DI), which are used to determine whether the image contains a large
number of dry or smudged blocks. The idea is that in a smudged block most ridges are
connected with each other, so the mean value of the block is small, while in a dry block
some of the ridges are disjointed and the mean value of the block will be larger. A poor
block is marked as a smudged block if its mean value is less than a preset smudged
threshold Ts, while a poor block is marked as a dry block if its mean value is larger than
the preset dry threshold Td. Both Ts and Td are determined by the mean value of the
foreground blocks of the image. Two further thresholds, TS and TD, were chosen empirically
to determine the type of a poor quality image: if SI ≥ TS and DI < TD the image is marked as
smudged, and if SI < TS and DI ≥ TD it is marked as dry. Shen et al., in their proposed
method, used fingerprint local orientation information for image segmentation and quality
specification. Qi et al. [26] combined
quality calculation of both the local and global features of a fingerprint image. Their hybrid
method combined the quality indices of local information (e.g. Gabor features, smudginess
and dryness) and global information (e.g. foreground area index, central position of
foreground index, minutiae count index and singular point index). The seven quality indices
are mathematically calculated as follows:
1. The Gabor feature index, based on the average Gabor feature value G(i) over the foreground blocks:

$$Q_1 = \frac{\min\!\left(\frac{\sum_{i=1}^{N_{FA}} G(i)}{N_{FA}},\, T_{ave}\right)}{T_{ave}} \qquad \text{(2-11)}$$

where $N_{FA}$ is the number of foreground blocks.
2. The smudginess and dryness indices are calculated by Equations (2-9) and (2-10)
respectively, and the quality of smudginess Q2 and the quality of dryness Q3 are
computed by:

$$Q_2 = 1 - SI, \qquad SI = \frac{N_{FS}}{N_{FA}} \qquad \text{(2-12)}$$

$$Q_3 = 1 - DI, \qquad DI = \frac{N_{FD}}{N_{FA}} \qquad \text{(2-13)}$$

where $N_{FS}$ and $N_{FD}$ are the numbers of smudgy and dry foreground sub-blocks, respectively.
3. The foreground area index:

$$Q_4 = \frac{N_{FA}}{N} \qquad \text{(2-14)}$$

where $N_{FA}$ is the number of foreground blocks, counted according to the rules given in [25], and $N$ is the total number of blocks.
4. The central position of foreground index, which measures how close the foreground centroid $(x_c, y_c)$ lies to the image centre:

$$Q_5^{x} = 1 - \frac{\left|x_c - \frac{width}{2}\right|}{\frac{width}{2}} \qquad \text{(2-15)}$$

$$Q_5^{y} = 1 - \frac{\left|y_c - \frac{height}{2}\right|}{\frac{height}{2}} \qquad \text{(2-16)}$$
5. The minutiae count index, which depends on the quantified relation between the actually
extracted minutiae count $n_{mc}$ and the expected minutiae count $E_{mc}$:

$$Q_6 = \frac{\min(n_{mc}, E_{mc})}{E_{mc}} \qquad \text{(2-17)}$$
6. The singular point (SP) index quality, calculated according to a set of rules, Equation (2-18).
7. Finally, the overall image quality is the combined value of the seven quality indices,
Equation (2-19).
$$Q = \sum_{i=1}^{7} \varpi_i Q_i \qquad \text{(2-19)}$$

where $\varpi_i$ is the weight assigned to quality index $Q_i$.
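The final combination step of Equation (2-19) amounts to a weighted sum; the sketch below is purely illustrative and the weight values are assumptions, not those of Qi et al. [26].

```python
# Illustrative only: combine the seven quality indices with assumed weights.
def overall_quality(q, weights=(0.25, 0.10, 0.10, 0.15, 0.15, 0.15, 0.10)):
    """q: sequence of the seven indices Q1..Q7, each in [0, 1]."""
    assert len(q) == 7 and abs(sum(weights) - 1.0) < 1e-9
    return sum(w * qi for w, qi in zip(weights, q))
```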
Nill et al. [27] proposed an objective image quality assessment based on the power
spectrum of normally acquired digital images. Their system is designed to compute image
quality based on the two-dimensional spatial frequency power spectrum of the digital image.
The power spectrum, which is the square of the magnitude of the Fourier transform of the
image, contains information on the sharpness, contrast, and detail rendition of the image,
and these are the components of visual image quality, i.e. the image's global information.
Their approach was implemented on fingerprint images as Image Quality of Fingerprint
(IQF) [28]. In IQF, the power spectrum is normalized by image contrast, average gray
level (brightness), and image size; a visual response function filter is applied, and the
pixels per inch (PPI) resolution scale of the fingerprint image is taken into account. The
fundamental output of IQF is a single-number image quality value, which is the sum of
the filtered, scaled, weighted power spectrum values. The power spectrum normalizations
allow valid inter-comparisons between disparate fingerprint images. IQF processing starts
with the acquired live-scan or inked image, i.e. a raw format image; it then locates the
approximate vertical and horizontal edges of the fingerprint image to identify its ROI,
defines a set of overlapping windows covering the entire fingerprint area as sub-images,
weeds out low-structure windows, and finally computes the image quality, i.e. window
power spectrum computation, normalization, incorporation of the human visual system
(HVS) by applying an HVS filter, and image quality weighting and scaling by pixels per
inch. A major benefit of an image quality measure based on the image power spectrum is
that it is applied to the naturally imaged scene. It does not require the use of designed
quality assessment targets or re-imaging of the same scene for comparison purposes; it
requires only the selection of an image area containing some structure, i.e. it is a blind
image quality assessment method. Chen et al. [29] analyzed the fingerprint global structure
by computing its 2D Discrete Fourier Transform (DFT). For
a fingerprint image, the ridge frequency value lies within a certain range. The ROI of the
spectrum is defined as an annular region with radius ranging between the minimum and
maximum typical ridge frequency values, Figures (2-5, 2-6), images from [22]. As the
fingerprint image quality increases, the energy becomes more concentrated in ring patterns
within the ROI. The global quality is measured by the energy concentration in ring-
shaped regions of the ROI; therefore a set of bandpass filters is constructed to compute the
amount of energy in ring-shaped bands. Good quality images will have the energy
concentrated in few bands. Chen et al. [29] used the power spectrum method, which
represents the magnitude of the various frequency components of a 2D fingerprint image
that has been transformed with the Fast Fourier Transform from the spatial domain into the
frequency domain. Different frequencies in the power spectrum are located at different
distances and directions from the origin, i.e. the centre of the power spectrum. Higher
frequency components of the image are located at greater distances from the origin, and
different directions from the origin represent different orientations of features in the
image. The power at each location in the power spectrum is an indication of the
frequency and orientation of a particular feature in the image.
The power spectrum $S_f(u,v)$ of an $M \times M$ point digital image $f[x,y]$ can be computed as the squared magnitude of its Fourier transform:

$$S_f(u,v) = \left|\sum_{x=0}^{M-1}\sum_{y=0}^{M-1} f[x,y]\, e^{-\frac{i2\pi(ux+vy)}{M}}\right|^{2}, \qquad u, v = -\frac{M}{2}, \ldots, \frac{M}{2} \qquad \text{(2-20)}$$
Evaluating the power spectrum is an excellent way to isolate periodic structural features
or noise in the image [27]. Since the power can vary by orders of magnitude in an image,
the power spectrum is usually represented on a log scale, Figures (2-5, 2-6). The power
spectrum approach does not depend on imaging designed targets, does not require
detection and isolation of naturally occurring targets, and does not require re-imaging the
same scene for comparison purposes. This approach is useful for distinguishing the overall
direction and the consistency of the fingerprint ridges and valleys because it is based on
the use of the frequency characteristics [30]. A ring in the Fourier spectrum is an indicator
of the quality of the image itself: in good quality images it appears clearly around the
origin, whereas bad quality images do not produce a ring in the Fourier spectrum. This is
due to the fact that bad quality images generally have a less uniform and less periodic
ridge-valley structure than good fingerprint images.
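A hedged sketch of such a spectral quality measure (Equation (2-20) followed by ring-energy concentration in the spirit of Chen et al. [29]) is shown below; the ridge-frequency band limits and the number of rings are illustrative assumptions.

```python
import numpy as np

def spectral_ring_quality(img: np.ndarray, fmin: float = 0.04, fmax: float = 0.25,
                          n_rings: int = 10) -> float:
    """Energy concentration of the power spectrum inside the ridge-frequency annulus."""
    img = img.astype(float) - img.mean()
    ps = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2       # Eq. (2-20), DC-centred
    fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0]))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1]))[None, :]
    r = np.sqrt(fx ** 2 + fy ** 2)                            # normalised radial frequency
    edges = np.linspace(fmin, fmax, n_rings + 1)              # annular ROI split into rings
    band_energy = np.array([ps[(r >= lo) & (r < hi)].sum()
                            for lo, hi in zip(edges[:-1], edges[1:])])
    p = band_energy / (band_energy.sum() + 1e-12)
    entropy = -(p * np.log(p + 1e-12)).sum()
    # good images concentrate energy in few rings -> low entropy -> quality near 1
    return float(1.0 - entropy / np.log(n_rings))
```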
Lee et al. [30] and Joun et al. [31] used a local contrast measurement in terms of the contrast of
the gray values between the ridges and valleys along the orientation of the ridge flow.
The idea behind their approach is that high directional contrast indicates good quality
orientation, while low contrast indicates bad quality. Mathematically, this approach is
represented by the following equations:
$$S_i(x, y) = \sum_{j=1}^{8} G(P_{ij}), \qquad i = 1, \ldots, 8 \qquad \text{(2-21)}$$

where $G(P_{ij})$ denotes the gray value of the pixel corresponding to position $P_{ij}$ in the
8-directional window that is used to compute the directional contrast. For each 8 × 8 block,
the local gray value $\theta_i$ is calculated using Equation (2-22), and the biggest value is taken
as $\theta_{\max}$:

$$\theta_i = \sum_{x=1}^{8}\sum_{y=1}^{8} S_i(x, y) \qquad \text{(2-22)}$$

The directional contrast $D_k$ is obtained from the difference between $\theta_{\max}$ and the $\theta_i$ of the
direction perpendicular to $\theta_{\max}$. Finally, the quality measure $Q_{DC}$ of the whole fingerprint
image is calculated by normalizing the sum of $D_k$ over the $N$ blocks, Equation (2-24):

$$Q_{DC} = \frac{1}{c}\sum_{k=1}^{N} D_k \qquad \text{(2-24)}$$

where $c$ is a normalization factor chosen so that the final result lies in [0, 1].
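A rough, illustrative sketch of the directional-contrast idea follows; the way the eight directional gray-value sums are sampled here is a simplification assumed for brevity, not Lee et al.'s exact windowing.

```python
import numpy as np

def directional_contrast(block: np.ndarray) -> float:
    """D_k for one 8x8 block: max directional gray-value sum minus the one for the
    perpendicular direction (sketch in the spirit of Eqs. 2-21/2-22)."""
    h, w = block.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    sums = []
    for k in range(8):                                   # directions 0, 22.5, ..., 157.5 deg
        theta = np.pi * k / 8
        t = np.linspace(-3.5, 3.5, 8)                    # 8 samples along the line
        ys = np.clip(np.round(cy + t * np.sin(theta)).astype(int), 0, h - 1)
        xs = np.clip(np.round(cx + t * np.cos(theta)).astype(int), 0, w - 1)
        sums.append(block[ys, xs].astype(float).sum())   # gray-value sum along direction k
    sums = np.array(sums)
    i_max = int(np.argmax(sums))
    return float(sums[i_max] - sums[(i_max + 4) % 8])    # contrast vs. perpendicular

# the image quality Q_DC would then be the normalised sum of D_k over all blocks (Eq. 2-24).
```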
Ratha et al. [32] present a method of quality estimation from wavelet-compressed
fingerprint images. However, it is not a desirable approach for uncompressed fingerprint
image databases, since the wavelet transform consumes much computation. They observe
that a significant fraction of the normalized cumulative spectral energy is within the first
few sub-bands of a wavelet scalar quantization (WSQ) compressed good quality
fingerprint image. Accordingly, they design rotation-invariant criteria to distinguish
smudged and blurred fingerprint images. Ratha et al. [33] described a pixel intensity
method of fingerprint image quality computation. The pixel intensity method classifies
image blocks into directional and non-directional as follows. The sum of intensity
differences Dd(i, j) between a pixel (i, j) and l pixels selected along a line segment oriented
in direction d is computed for d = 0, π/n, ..., π, where f(i, j) is the intensity of pixel (i, j)
and fd(i', j') are the intensities of the neighbours of pixel (i, j) along direction d. For each
orientation d, the histogram of Dd(i, j) values is obtained for all pixels within a given
foreground block. If only one of the n histograms has a maximum value greater than a
prominent threshold, the block is marked as "directional"; otherwise, the block is marked
as "non-directional" [34]. The overall quality of the fingerprint image is computed from the
directional blocks by assigning a relative weight wi to foreground block i at location xi,
given by:
$$w_i = e^{-\frac{\left\|x_i - x_c\right\|^{2}}{2q^{2}}} \qquad \text{(2-26)}$$

where $x_c$ is the centroid of the foreground and $q$ is a normalization constant.

$$Q = \frac{\sum_{i \in D} w_i}{\sum_{i \in F} w_i} \qquad \text{(2-27)}$$

where $D$ is the set of directional blocks and $F$ is the set of foreground blocks.
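A small sketch of Equations (2-26)-(2-27) is given below; it assumes the block centres and the directional/non-directional decision are already available, and the value of q is an arbitrary assumption.

```python
import numpy as np

def directional_block_quality(centres: np.ndarray, directional: np.ndarray,
                              q: float = 64.0) -> float:
    """centres: (n, 2) centres of foreground blocks; directional: boolean mask.
    Returns Q in [0, 1]: centre-weighted fraction of directional blocks (Eqs. 2-26/2-27)."""
    xc = centres.mean(axis=0)                            # centroid of the foreground
    d2 = ((centres - xc) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2.0 * q ** 2))                     # Eq. (2-26): relative weights
    return float(w[directional].sum() / w.sum())         # Eq. (2-27)
```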
image is considered to be of poor quality. Tabassi et al. [35, 36] used a classifier method
to define the quality measure as the degree of separation between the match and non-
match distributions of a given fingerprint. This can be seen as a prediction of the matcher
performance. Their method was implemented as a neural network and released by
the National Institute of Standards and Technology as the Fingerprint Image Quality
package (NFIQ) [37], in which a novel strategy for estimating fingerprint image quality is
presented. An image quality map is generated by the minutiae detection module (MINDTCT)
for quality measurement of localized regions in the image, by determining the directional
flow of ridges and detecting regions of low contrast, low ridge flow, and high curvature.
The quality map is combined with feature extraction that computes fingerprint image
fidelity characteristics, resulting in an 11-dimensional feature vector, as shown in
Table 2-1.
[Table 2-1 (truncated): feature Name and Description; e.g. entry 2, "total # of minutiae": the number of total minutiae found in the fingerprint.]
A neural network block then classifies the feature vectors into five classes of quality based on
various quantities of the normalized match score distribution, Figure (2-7). The final quality
map contains an integer value between 1 (highest) and 5 (poorest). The quality measure
can be seen as a prediction of matcher performance. This approach uses both local and
global features to estimate the quality of a fingerprint image. Zhu et al. [38] proposed a
neural network based fingerprint image quality estimation, which estimates the
correctness of the ridge orientation of each local image block using a neural network and then
computes the global image quality based on the local orientation correctness.
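The classification stage can be pictured with a generic multi-layer perceptron; the sketch below is purely illustrative (random stand-in data and a scikit-learn MLP), not the NFIQ network or its training data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical data: 11-dimensional feature vectors and quality labels 1..5.
rng = np.random.default_rng(0)
X_train = rng.random((500, 11))
y_train = rng.integers(1, 6, size=500)

clf = MLPClassifier(hidden_layer_sizes=(22,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

quality_class = clf.predict(rng.random((1, 11)))[0]   # 1 = highest, 5 = poorest
```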
implementation. In an enrolment apparatus, the unique number, for use in generating the
public key and private key of the system, is generated by manipulation of the fingerprint
information of a subscriber. A filter is then generated which is a function of both the
Fourier transform of the subscriber's fingerprint(s) and of a unique number. This filter is
stored on a subscriber card. When the subscriber wishes to generate his public or private
key, he inputs his card into a card reader of an apparatus and places his finger(s) on a
fingerprint input. The apparatus generates an optical Fourier transform from the
fingerprint input. The Fourier transform signal is incident onto a spatial light modulator
programmed with the filter information from the card. An inverse transform is generated
from the filtered signal and this is used to regenerate the unique number. The apparatus
also has a subsystem for utilizing the private key to decrypt an input encrypted message.
Soutar et al. [12] proposed a biometric encryption algorithm using image processing. This
algorithm binds a cryptographic key with the user's fingerprint images at the time of
enrolment. The key is then retrieved only upon a successful authentication. Biometric
Encryption (BE) has been developed to securely link and retrieve a digital key using the
interaction of a biometric image, such as a fingerprint, with a secure block of data, known
as a Bioscrypt. The key can be used as an encryption/decryption key. The Bioscrypt
comprises a filter function, which is calculated using an image processing algorithm, and
other information which is required to first retrieve, and then verify the validity of, the key.
The key is retrieved using information from the output pattern formed via the interaction
of the biometric image with the filter function. Soutar et al. [15] proposed a merging of
biometrics with cryptography by using a biometric to secure the cryptographic key.
Instead of entering a password to access the cryptographic key, the use of this key is
guarded by biometric authentication. Key release is dependent on the result of the
verification part of the system. Thus, biometric authentication can replace the use of
passwords to secure a key. The proposed algorithm offers both convenience, as the user
no longer has to remember a password, and secure identity confirmation, since only the
valid user can release the key. BE [12, 15] processes the entire fingerprint image. The
mechanism of correlation is used as the basis for the BE algorithm. The correlation
function c(x), between a subsequent version of the input, f1(x), obtained during
verification, and f0(x), obtained during enrolment, is formally defined as
$$c(x) = \int_{-\infty}^{\infty} f_1(v)\, f_0^{*}(x + v)\, dv \qquad \text{(2-28)}$$

where $*$ denotes the complex conjugate.
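As a quick numerical illustration of Equation (2-28) and of the Fourier-domain form introduced next (Equation (2-29)), the following hedged sketch uses 1-D toy signals; it is not Soutar et al.'s optical or filter-based implementation.

```python
import numpy as np

# 1-D toy signals standing in for enrolment (f0) and verification (f1) data.
rng = np.random.default_rng(1)
f0 = rng.random(64)
f1 = np.roll(f0, 5) + 0.05 * rng.random(64)        # shifted, slightly noisy version

# Eq. (2-29): c(x) = FT^{-1}{ F1(u) F0*(u) } (circular correlation via the FFT).
F0, F1 = np.fft.fft(f0), np.fft.fft(f1)
c = np.real(np.fft.ifft(F1 * np.conj(F0)))

# A genuine sample produces a sharp correlation peak at the relative shift.
print(int(np.argmax(c)))                            # -> 5 for this toy example
```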
In a practical correlation system, the system output is computed as the inverse Fourier
transform ($FT^{-1}$) of the product of $F_1(u)$ and $F_0^{*}(u)$:

$$c(x) = FT^{-1}\left\{F_1(u)\, F_0^{*}(u)\right\} \qquad \text{(2-29)}$$

where $F_0^{*}(u)$ is typically represented by the filter function, $H(u)$, that is derived from
$f_0(x)$. For correlation based biometric systems, the biometric template used for
and purposes without fear that these separate identifiers or users will be linked together
by a single biometric image or template. Thus, if a single account identifier becomes
compromised, there is far less risk that all the other accounts will also be compromised.
Even better, Biometric Encryption technologies make it possible to change or
recompute account identifiers. That is, identifiers may be revoked or cancelled, and
substituted with newly generated ones calculated from the same biometric! Traditional
biometric systems simply cannot do this. Costanzo [47] proposed an approach for
generating a cryptographic key from an individual's biometric for use in proven
symmetric cipher algorithms. According to this approach, Figure (2-8), the encryption
process begins with the acquisition of the required biometric samples.
Features and parameters are extracted from these samples and used to derive a biometric
key that can be used to encrypt a plaintext message and its header information. The
decryption process starts with the acquisition of additional biometric samples from which
the same features and parameters are extracted and used to produce a “noisy” key as done
in the encryption process. Next, a small set of permutations of the “noisy” key are
computed. These keys are used to decrypt the header information and determine the
validity of the key. If the header is determined to be valid, then the rest of the message is
decrypted. The proposed approach eliminates the need for biometric matching algorithms,
reduces the cost associated with lost keys, and addresses non-repudiation issues. In Key
Generation based on biometric aggregation [47], several invariant features of different
types of biometric are used to derive a bio-key that is used to encrypt a plain text message
with header information. The decryption is based on a new generated bio-key which may
not be exactly the same as the initial key. Different permutations of the newly computed
bio-key are used to decrypt the header of the encrypted message after which the rest of
the message is inferred. This approach was shown to be efficient and to address the non-
repudiation problems. However, to be robust, this scheme needs several biometrics.
Davida et al. [44], [48] proposed an algorithm based on the iris biometric. They
considered a binary representation of iris texture, called the Iris Code [49], which is 256 bytes
in length. The biometric matcher computes the Hamming distance between the input and
database template representations and compares it with a threshold to determine whether
the two biometric samples are from the same person or not. The authors assume that
Iris Codes from different samplings of the same iris can have up to a 10% error rate of the
256-byte vector, which means 204 bits may differ from the same iris's template Iris Code.
The authors also assume that the Iris Codes of different irises differ in as many as 45% of
the 256 bytes (922 bits). Davida et al. [44], [48] argue that the database template of a user
itself can be used as a cryptographic key (note that this key would always be the same for
the same biometric identifier, in contrast to cryptographic key binding algorithms such as
the biometric encryption algorithm). The main criticism of Davida et al.'s work is that they
assumed that the input and database template Iris Codes are completely aligned.
Although constrained iris image acquisition systems can limit the misalignment among
different acquisitions of the same iris, some degree of misalignment is natural, and they
have ignored this fact in their algorithm. Another criticism of Davida et al.'s work, raised in
[50], is that no concrete implementation work was reported, and it was found that the
majority coding does not work with real iris data as errors are strongly correlated. Monrose et al.
[51] proposed a novel approach to improving the security of passwords by combining a
short binary string, derived from keystroke biometrics, with passwords. In their
approach, the legitimate user's typing patterns (e.g., durations of keystrokes, and latencies
between keystrokes) are combined with the user's password (pwd) to generate a
hardened password (hpwd) that is convincingly more secure than conventional
passwords against both online and offline attackers. During enrolment, the following
information is stored in the user's database template: 1) a randomly chosen large prime
number (r) of length 160 bits; 2) an "instruction table" created on the basis of a secret
sharing scheme and then encrypted with pwd; the instruction table is created using the user's
keystroke features (the measurable keystroke features for an 8-character password are
relatively few, at most 15 on standard keyboards). These features are thresholded to
generate a binary feature descriptor, and the binary feature descriptors are then used to create
the instruction table using Shamir's secret sharing scheme [8]; and 3) an encrypted
"history file" that contains the measurements for all features. At the time of
“history file” that contains the measurements for all features. At the time of
authentication, the algorithm uses (r ) and the instruction table from the user’s template
and the authentication password ( pwd )' and keystroke features acquired during the
authentication to compute (hpwd )' . The (hpwd )' is used to decrypt the encrypted history
file. If the decryption is successful, the authentication is successful, and the (r ) and
history file of the user are modified in the template; if the authentication is unsuccessful,
another instance of (hpwd )' is generated from the instruction table in a similar way but
with some error correction, and the authentication is tried again. If the authentication
does not succeed within a fixed number of error-correction iterations, the authentication
finally fails. The authors claim that the hardened password itself can be used as an
encryption key. A weakness of this work is that it only adds about 15 bits of entropy to
the passwords, thus making them only marginally more secure. However, in [52],
Monrose et al. made some minor modifications to their original scheme, applied it to
spoken password, i.e. voice biometrics (which is more distinctive than keystroke
biometrics), and were eventually able to generate cryptographic keys of up to 60 bits,
which although much higher than the 15 bits achieved in their earlier work, is still quite
low for most security applications. Keystroke patterns are also used for the purpose of
authenticating users accessing a computer system [53]. Keystroke rhythms are a method
that tries to capture an individual's behaviour. In [53], the biometric data are assigned to a
vector which carries all the well-known property values. By using a minimum distance
classifier, a decision can easily be made by finding the distance between the test
pattern and the templates of each individual, which are determined beforehand in a
training phase. The approach proposed in [53] has four major steps. In the first step,
parameters of the users' keystrokes are collected using a login form and stored in a database.
The next step is the validation step, where the users' parameters are processed by an efficient
validation algorithm; at the end of this stage, new parameters are generated. In the
decision making step, the new values calculated during the validation phase are transferred to
a decision function, and the user is accepted or rejected. In the final step, the parameters
belonging to a successful login are updated in the database. Keystroke patterns are low-cost,
user-specific data, especially suited for biometric authentication and cryptography, though it
should be noted that they are usually difficult to detect and analyze. Similar to image type
biometrics, the human voice is a good biometric from which to generate a cryptographic key [52, 54]. In
[55], Hao et al. made use of handwritten signatures. They defined forty-three signature
features extracted from dynamic information such as velocity, pressure, altitude and azimuth.
Feature coding was used to quantize each feature into bits, which were concatenated to
form a binary string. Their key achieved on average 40 bits of entropy with a 28% false
rejection rate; the false acceptance rate was about 1.2%. The key derivation performs shape
matching to rule out poor-quality signatures in the initial verification phase. The authors
claim an Equal Error Rate (EER) of 8%, and mention that their test database contains
forgeries, but unfortunately provide no details on how these were produced or their
quality. Kuan et al. [56, 57] presented a method for generating replaceable cryptographic
keys from dynamic handwritten signatures, so that the keys can be replaced if they are
compromised, without requiring a template signature to be stored or any statistical
information that could be used to reconstruct the biometric data. The replaceable key is
accomplished using the iterative inner product of the Biohash method and a modified multiple-bit
discretization that deters guessing from key statistics. They obtained encouraging results,
especially for skilled and random forgery, whereby the equal error rates are <6.7% and
~0% respectively, indicating that the keys generated are sufficiently distinguishable from
impostor keys. Some work on cryptographic key generation was done toward a fuzzy
vault technique (which will be reviewed later) based on [52]. Chang et al. [58] proposed a
framework to generate stable cryptographic keys from biometric data that is unstable in
nature. Their proposed framework differs from prior work in that user-dependent
transforms are utilized to generate more compact and distinguishable features; thereby, a
longer and more stable bit stream can be generated as the cryptographic key. For
feasibility verification, the proposed framework was evaluated on a face database.
However, [52] and [58] did not address the issue of setting the thresholds for
distinguishable features. This issue was tackled by Zhang et al. in [59]: they
proposed a method to minimize the authentication error rate, in terms of the false accept
rate and the false reject rate of the bio key generation system, by setting optimal
thresholds for each feature. The previously reviewed works assumed that enrolled templates are
thresholds of each feature. Previous reviewed works assumed that enrolled templates are
noise free, and aligned. To turn noisy information into usable keys for any cryptographic
application and, in particular, reliably and securely authenticating biometric data, Dodis
et al. [60] proposed theoretical foundations for generating keys from the key material that
is not exactly reproducible. They provided formal definitions and efficient secured
techniques for cryptographic key generation. They defined fuzzy extractors (FE) to
generate a variable (R ) from the key material (w) , and public (helper) data (P ) . Given the
variable (P ) , FE again generates (R ) from (w)' , if (w)' is “close” to (w) . For three distance
metrics (Hamming distance, set difference and edit distance), Dodis et al. calculated the
information revealed by (P ) , and elaborated on the existence of possible algorithms for
FE construction. They also proposed a modification of the Juels and Sudan’s fuzzy vault
scheme [45]: instead of adding chaff points to the projections of the polynomial ( p ) ,
Dodis et al. [60] proposed to use a polynomial ( p )' (of degree higher than ( p ) ) which
overlaps with ( p ) only for the points from the genuine set ( A) . This new polynomial ( p )'
replaces the final point set (R ) of Juels and Sudan’s scheme [45]. Juels and Sudan’s
fuzzy vault scheme [45] is an improvement upon the previous work by Juels and
Wattenberg [61]. In [45], Alice can place a secret (k) (e.g., a secret encryption key) in a
vault and lock (secure) it using an unordered set (A). Here, an unordered set means that the
relative positions of set elements do not change the characteristics of the set: e.g., the set
{-2, -1, 3} conveys the same information as {3, -1, -2}. Bob, using an unordered set (B),
can unlock the vault (access (k)) only if (B) overlaps with (A) to a great extent. The
procedure for constructing the fuzzy vault is as follows: First, Alice selects a
polynomial (p) of variable (x) that encodes (k) (e.g., by fixing the coefficients of p
according to (k)). She computes the polynomial projections, p(A), for the elements of (A).
She adds some randomly generated chaff points that do not lie on (p), to arrive at the
final point set (R). When Bob tries to learn (k) (i.e., find (p)), he uses his own unordered
set (B). If (B) and (A) substantially overlap, he will be able to locate many points in (R)
that lie on (p). Using error-correction coding (e.g., Reed-Solomon [62]), it is assumed
that he can reconstruct (p) (and hence (k)), as sketched in the toy example below.
35
vault scheme requires pre-aligned biometric templates. Namely, the biometric data at the
time of enrolment (locking) must be properly aligned with biometric data at the time of
verification (unlocking). This is a very difficult problem due to different types of
distortion that can occur in biometric data acquisition. Further, the number of feasible
operating points (where the vault operates with negligible complexity, e.g., conveyed via
the number of required access attempts to reveal the secret, for a genuine user and with
considerable complexity for an impostor user) for the fuzzy vault is limited: for example,
the flexibility of a traditional biometric matcher (e.g., obtained by changing the system
decision threshold) is not present. Based on the fuzzy vault scheme, Clancy et al. [63]
proposed a fingerprint vault using multiple minutiae location sets per finger (based on 5 impressions of a finger). They first find the canonical positions of the minutiae and use these as the elements of the set A. They add the maximum number of chaff points to find R, which locks k. However, their system inherently assumes that the fingerprints (the one that locks the vault and the one that tries to unlock it) are pre-aligned. This is not a realistic assumption for fingerprint-based authentication schemes. Clancy et al. [63] simulated the error-correction step without actually implementing it. They found that 69-bit security (for the False Accept Rate (FAR)) could be achieved with a False Reject Rate (FRR) of 20–30%. Note that the cited security translates to a FAR of $2^{-69} \approx 1.7\times10^{-21}$. Further, the FRR value
suggests that a genuine user may need to present his/her finger multiple times to unlock
the vault. Uludag and Jain [64] used a line-based fingerprint minutiae representation to design a fuzzy vault system, Figure (2-9), but without an actual implementation. It differs from Clancy's system in that both the location and the angle of minutiae are used to extract lines for forming the templates. Uludag et al. [65] present their implementation of the fuzzy vault, operating on fingerprint minutiae features. These features are represented as (x, y, θ) of a ridge ending or bifurcation, where (x, y) are the minutia coordinates and θ is the angle of the associated ridge, Figure (2-10).
Figure 2-9 Fuzzy vault system block diagram.
They extend [64] into [65], where chaff points are generated according to the minutiae points and the protected secret; this is visible in the secret check block (cyclic redundancy check encoding) and the chaff generation block. [65] differs from the work in [45] in that the decoding implementation does not include any error-correction scheme, since there are serious difficulties in achieving error correction with biometric data; developing the necessary polynomial reconstruction via error correction has not been demonstrated in the literature. The fuzzy vault for fingerprints decodes many candidate secrets, Figure (2-11). To identify which candidate is valid, a Cyclic Redundancy Check (CRC) is used; the CRC is commonly used for error detection. In the proposed system, using incorrect minutiae points during decoding will cause an incorrect polynomial reconstruction, resulting in errors. Uludag et al. [65] generate 16-bit CRC data from the secret S; hence, the chance of a random error being undetected is $2^{-16}$, with a 16-bit primitive polynomial defining the CRC arithmetic. The quantized minutiae attributes are concatenated to arrive at the 16-bit locking/unlocking data unit u. The secret-plus-check data SC is used to find the coefficients of the polynomial p: the 144-bit SC can be represented as a polynomial with 9 (144/16) coefficients of 16 bits each over GF(2^16).

Figure 2-11 Fuzzy fingerprint vault: (a) vault encoding, (b) vault decoding [65]
Evaluating p(u) on the template minutiae features T gives the genuine set G: starting with N template minutiae sorted according to ascending u values, $u_1, u_2, \ldots, u_N$,

$$ G = \{(u_1, p(u_1)), (u_2, p(u_2)), \ldots, (u_N, p(u_N))\} $$

Chaff points $c_1, c_2, \ldots, c_M$ are then randomly selected from the field $GF(2^{16})$, with the constraint that they do not overlap with $u_1, u_2, \ldots, u_N$, and ordinates $d_1, d_2, \ldots, d_M$ are chosen with the constraint that the pairs $(c_j, d_j),\ j = 1, 2, \ldots, M$ do not fall onto the polynomial p. The union of the genuine and chaff sets is passed through a list scrambler, which randomizes the list with the aim of removing any stray information that could be used to separate chaff points from genuine points. This results in the vault set VS. Along with VS, the polynomial degree D forms the final vault V.

In the unlocking part of the proposed system, the vault V is accessed using N query minutiae $Q = \{u_1^*, u_2^*, \ldots, u_N^*\}$. The points to be used in polynomial reconstruction are found by comparing $u_i^*,\ i = 1, 2, \ldots, N$ with the abscissa values of the vault V, namely $v_l,\ l = 1, 2, \ldots, (N+M)$. If any $u_i^*$ is equal to some $v_l$, the corresponding vault point $(v_l, w_l)$ is added to a candidate list; this list has K points, where $K \leq N$. For decoding a degree-D polynomial, $(D+1)$ unique projections are necessary. All possible combinations of $(D+1)$ points are formed from the list of size K, resulting in $\binom{K}{D+1}$ combinations. A Lagrange interpolating polynomial is constructed for each combination; for

$$ L = \{(v_1, w_1), (v_2, w_2), \ldots, (v_{D+1}, w_{D+1})\} \qquad (2\text{-}33) $$

it is given by

$$ p^*(u) = \frac{(u-v_2)(u-v_3)\cdots(u-v_{D+1})}{(v_1-v_2)(v_1-v_3)\cdots(v_1-v_{D+1})}\,w_1 + \cdots + \frac{(u-v_1)(u-v_2)\cdots(u-v_D)}{(v_{D+1}-v_1)(v_{D+1}-v_2)\cdots(v_{D+1}-v_D)}\,w_{D+1} \qquad (2\text{-}34) $$
This calculation is carried out in the Galois field $GF(2^{16})$ to yield the polynomial coefficients. The coefficients are mapped back to the decoded secret. To check whether there are errors in this secret, the CRC primitive polynomial is applied. By the definition of the CRC, if the remainder is not zero, it is certain that there are errors; if the remainder is zero, it is assumed that there are no errors. In general, if the query minutiae Q overlap with the template minutiae T in at least (D+1) points for some combination, the correct secret will be decoded, namely S* = S will be obtained. This is the desired outcome when query and template fingerprints are from the same finger. The work proposed in [65] suffers from complexity and alignment problems. The authors claimed that the complexity of attacks that can be launched by impostor users is high; the scheme also incurs high time complexity due to the need to evaluate multiple point combinations during decoding.
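To make the locking and unlocking flow described above concrete, a minimal sketch is given below. It works over a small prime field (65537) instead of the GF(2^16) arithmetic used in [65], stands in a trivial secret-equality check for the CRC, and uses illustrative parameter values throughout; it is a demonstration of the technique, not the implementation of [65].

```python
import random
from itertools import combinations

P = 65537  # small prime field standing in for GF(2^16) (illustrative choice)

def eval_poly(coeffs, x):
    """Evaluate a polynomial (lowest-degree coefficient first) at x, mod P."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def poly_mul(a, b):
    """Multiply two polynomials (lowest-degree first) mod P."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def lock(secret_coeffs, genuine_x, n_chaff=30):
    """Build a vault: genuine points lie on the secret polynomial, chaff points do not."""
    vault = [(x, eval_poly(secret_coeffs, x)) for x in genuine_x]
    used = set(genuine_x)
    while len(vault) < len(genuine_x) + n_chaff:
        cx, cy = random.randrange(P), random.randrange(P)
        if cx in used or cy == eval_poly(secret_coeffs, cx):
            continue  # chaff abscissae must be fresh and must not lie on p
        used.add(cx)
        vault.append((cx, cy))
    random.shuffle(vault)  # list scrambler
    return vault

def lagrange_coeffs(points):
    """Interpolate the unique degree-(len(points)-1) polynomial through the points, mod P."""
    n = len(points)
    coeffs = [0] * n
    for i, (xi, yi) in enumerate(points):
        num, denom = [1], 1
        for j, (xj, _) in enumerate(points):
            if j != i:
                num = poly_mul(num, [(-xj) % P, 1])
                denom = (denom * (xi - xj)) % P
        scale = (yi * pow(denom, P - 2, P)) % P  # modular inverse via Fermat
        for k, c in enumerate(num):
            coeffs[k] = (coeffs[k] + c * scale) % P
    return coeffs

def unlock(vault, query_x, degree, check):
    """Try every (degree+1)-subset of matching vault points; accept when check() passes."""
    candidates = [(x, y) for (x, y) in vault if x in set(query_x)]
    for subset in combinations(candidates, degree + 1):
        coeffs = lagrange_coeffs(list(subset))
        if check(coeffs):
            return coeffs
    return None

if __name__ == "__main__":
    secret = [12345, 678, 9]          # degree-2 polynomial encodes the secret
    genuine = [3, 17, 101, 250, 999]  # stand-in for quantized minutiae values u_i
    vault = lock(secret, genuine)
    # the "CRC" here is simply knowledge of the secret, to keep the sketch short
    recovered = unlock(vault, query_x=[17, 999, 3, 42], degree=2,
                       check=lambda c: c == secret)
    print("recovered:", recovered)
```

The exhaustive subset search in `unlock` also illustrates why decoding complexity grows quickly with the number of chaff points and the polynomial degree.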
In [66], Uludag and Jain proposed a new biometric cryptosystem designed to overcome the security and privacy problems of previous biometric systems. They proposed to protect the biometric templates as a transformed version of the original
template within a cryptographic framework. Their implementation of the fuzzy fingerprint vault uses the orientation field to derive helper data, which is used to align the query and the template and thereby provides an automatic solution to the fuzzy vault alignment problem. Utilizing
maximum curvature information (invariant to translation and rotation of fingerprints) of
orientation field flow curves, the query fingerprint is aligned with respect to the template via a variant of the Iterative Closest Point (ICP) algorithm. Their alignment routine achieves
reasonable accuracy, considering the small amount of data used for alignment. Further,
the helper data does not leak any information about the minutiae-based fingerprint template. The criticism of [66] is that it is not sufficient to handle ridge distortion and deformation, which increase as we move away from the centre of the fingerprint area towards the periphery. In addition, the designed system depends on user habituation and cooperation to increase the authentication accuracy. The system was developed for a positive identification scenario where the user is expected to be cooperative (for user convenience); false rejects will reduce with increased user
cooperation. Chung et al. [67] proposed a geometric hashing technique to perform alignment in a minutiae-based fingerprint fuzzy vault, but it still has the problem of limited security: the maximum number of hiding (chaff) points for concealing the real fingerprint minutiae is limited by the size of the fingerprint sensor and of the captured fingerprint images, and by the possible degradation of the verification accuracy caused by the added chaff minutiae. All the approaches in [63, 64, 67] assumed the number of chaff points to be 200. Lee et al. [68] proposed both automatic alignment of the fingerprint data and higher security by using a 3D geometric hash table; the number of chaff points in their approach was twice that of previous approaches, and the complexity of cracking the proposed system was correspondingly very high.
2.4 Summary
Cryptography and biometrics have been identified as two of the most important aspects of the digital security environment, and for various types of security problems the merging of cryptography and biometrics has led to the development of bio-crypt technology. The new technology suffers from several limitations, e.g. biometric image quality, validity, image alignment, cancelability, key revocation and repeatability. The literature review therefore follows the merged technology's life cycle, starting with quality and validity analysis. This part reviews existing approaches for fingerprint image-quality estimation, including the rationale behind the published measures and visual examples showing their behaviour under different quality conditions. To the best of the author's knowledge, all published works treat the validity issue entirely as quality assessment: they assume that all images are valid and need only quality assessment. Quality assessment has been conducted in both fields of information, i.e. local and global characteristics. The second part of the review, following the bio-crypt life cycle, covers bio-crypt development approaches, which the literature divides into three categories (key hiding, one-way function generation and fuzzy key generation) or, on the basis of the merging technique, into: (1) loosely-coupled mode (biometric key release), in which biometric matching is decoupled from the cryptographic part; matching operates on the traditional biometric template and, if it succeeds, the cryptographic key is released from its secure location, e.g. a server or smart card; and (2) tightly-coupled mode (biometric key generation), in which biometrics and cryptography are merged at a much deeper level, where matching can effectively take place within the cryptographic domain, so there is no separate matching operation that can be attacked; the key is extracted from a collected heterogeneous mass (key/bio template) as a result of positive matching. The literature review highlights the remarkable problems and challenges that face biometric cryptography.
Chapter 3 Fingerprint Image Analysis
3.1 Introduction
Fingerprint is one of the oldest and most widely used biometric traits. Modern scientific fingerprint technology is used in the acquisition stage of the system infrastructure due to its low cost and simplicity of operation [2]. For this purpose, a wide range of sensors is available commercially to attain a digital fingerprint image, making it easy to obtain and then accept or reject the fingerprint image for further processing. Clarification of fingerprint image structure is crucial for many fingerprint applications, and the performance of the systems built on them relies on the validity and quality of the captured images. A validity check eliminates invalid images before the life cycle of fingerprint metadata enrolment begins; the overall benchmarking accuracy of the system is therefore not affected, because an invalid image is rejected before it enters the system cycle. This chapter explains the basic characteristics of fingerprint images from local and global analysis points of view, as well as the relationship between these factors and validity check results. A fingerprint is a group of associated curves: the bright curves are called valleys while the dark curves are called ridges, Figure (3.1). The fingerprint local structure constitutes the main texture-like pattern of ridges and valleys, i.e. the detailed pattern around a minutia point, while a valid global structure puts the ridges and valleys into a smooth flow, i.e. the overall pattern of the ridges and valleys. The ridge-to-valley structure is analysed to detect image validity values, while image quality is judged by its local and global structure. To study the locality and globality of the fingerprint pattern, we first define the fingerprint representation area, where the region of interest (ROI) is detected; the image area without effective ridges and furrows is first discarded since it only holds background information. Then the boundary of the remaining effective area is sketched out, since minutiae in the boundary region are easily confused with the spurious minutiae that are generated where the ridges run off the sensor. ROI detection and segmentation are described in section 3.3, and the fingerprint pattern locality and globality are introduced in section 3.4. A proposed validity check algorithm based on ridge-valley statistical weight analysis is then discussed. Finally, section 3.7 provides a summary and discussion of this chapter.
searched or where all the macro features are found (e.g. ridge patterns, ridge pattern area, core point, delta point, type lines and ridge count). The accurate search area is the whole perfect ridge pattern area; it is normally defined by diverging ridge flows that form a delta. It is designed to account for deviations in detected feature positions due to noise and processing variations. Increasing the search area is equivalent to reducing the scanning resolution, and it reduces the accuracy with which feature positions are detected.
background, i.e., the histogram of local region contrasts must have two pinnacles. It is clear that thresholding is a fundamental tool for the segmentation of grey-level images when object and background pixels can be distinguished by their grey-level values. Given a digital image I(i, j) of dimension N_x × N_y, I(i, j) represents the intensity at location (i, j); L is the maximum number of grey levels, and K = log2(L) is usually termed the pixel depth, or the number of bits per pixel of the image. The grey-level-based method works by quantizing the local contrast histogram into levels 0 to L−1, taking the mean contrast as the initial threshold T_0; the threshold T_{i+1} is then calculated iteratively by:
$$ T_{i+1} = \frac{1}{2}\left[\frac{\sum_{k=0}^{T_i} k\, h_k}{\sum_{k=0}^{T_i} h_k} + \frac{\sum_{k=T_i+1}^{L-1} k\, h_k}{\sum_{k=T_i+1}^{L-1} h_k}\right] \qquad (3\text{-}1) $$

where $h_k$ is the number of pixels whose grey value equals k.
The iteration finishes when $T_{i+1} = T_i$. According to the value $T_i$ at which the iteration finishes, the segmentation threshold is taken as $kT_i$, where the coefficient k adjusts the severity of the segmentation: the larger k is, the smaller the foreground.
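As an illustration of Equation (3-1), a minimal Python sketch of the iterative threshold update is given below; the 256-bin histogram, the mean-grey-level initialisation and the default value of k are assumptions for demonstration.

```python
import numpy as np

def iterative_threshold(image, k=1.0, max_iter=100):
    """Iterate Equation (3-1): split the histogram at T_i, average the two class means,
    and stop when the threshold no longer changes. Returns the segmentation threshold k*T_i."""
    hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
    levels = np.arange(256)
    t = int(image.mean())  # T_0: mean grey level (assumed initialisation)
    for _ in range(max_iter):
        low, high = hist[:t + 1], hist[t + 1:]
        mean_low = (levels[:t + 1] * low).sum() / max(low.sum(), 1)
        mean_high = (levels[t + 1:] * high).sum() / max(high.sum(), 1)
        t_new = int(round(0.5 * (mean_low + mean_high)))
        if t_new == t:          # iteration finishes when T_{i+1} = T_i
            break
        t = t_new
    return k * t                # coefficient k adjusts the severity of segmentation

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(64, 64))  # stand-in for a grey-level fingerprint image
    print("threshold:", iterative_threshold(img))
```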
To find the ROI with this method, the image is partitioned into a number of blocks by a rectangular or square grid. Each interior (fingerprint portion) block is likely to contain more bright area than the blocks on the boundary and in the blank regions. Figure (3-2 (a)) shows a 2D fingerprint image, and (b) shows the histogram of the grey-scale fingerprint image in (a). Using the Otsu optimum threshold method [76], a threshold value is found for the image segmentation. Each pixel of the fingerprint image can then be classified into one of two classes, bright or dark: a pixel belongs to the bright class if its value is greater than the threshold value; otherwise it belongs to the dark class. The thresholded image is partitioned into a union of disjoint square blocks. The percentage of white area within each block is computed and compared with a threshold value: if it is greater than the threshold, all pixels in the block are set to white; otherwise, black.
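The block-wise decision just described can be sketched as follows; the block size and the white-percentage threshold are illustrative assumptions, and the binary input is assumed to come from a prior thresholding step (e.g. Otsu's method).

```python
import numpy as np

def block_roi_mask(binary_img, block=16, white_ratio=0.3):
    """Partition a thresholded (0/1) image into disjoint square blocks; a block whose
    fraction of white pixels exceeds the threshold is set entirely white, otherwise black."""
    h, w = binary_img.shape
    mask = np.zeros_like(binary_img)
    for i in range(0, h, block):
        for j in range(0, w, block):
            tile = binary_img[i:i + block, j:j + block]
            if tile.mean() > white_ratio:
                mask[i:i + block, j:j + block] = 1
    return mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    binary = (rng.random((128, 128)) > 0.5).astype(np.uint8)  # stand-in thresholded image
    roi = block_roi_mask(binary)
    print("foreground block fraction:", roi.mean())
```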
In the resulting image, the white region represents the region of interest (ROI), as shown in Figure (3-2 (c)). Overlaying (c) on (a), the region of the fingerprint image to be used for further processing is produced; the result is shown in Figure (3-2 (d)). The obtained result shows that segmentation of the original object from the background proceeds, as expected, from the clear separation of modes in the histogram, and it was effective for all types of fingerprint images, i.e. poor and good quality. Fingerprint image segmentation based on the grey-level method is not so easily done on fingerprint images with low contrast; in these cases, image enhancement techniques must be applied first to improve the visual appearance of the fingerprint image. Another major problem is the setting of a correct threshold value, or an automated threshold, that classifies each pixel as object or background.
For the direction-based method, with G_x and G_y denoting the x- and y-gradient components accumulated over a w × w window centred at (i, j), the following quantities are computed:

$$ O_x(i,j) = \sum_{u=i-\frac{w}{2}}^{i+\frac{w}{2}}\ \sum_{v=j-\frac{w}{2}}^{j+\frac{w}{2}} 2\,G_x(u,v)\,G_y(u,v) \qquad (3\text{-}2) $$

$$ O_y(i,j) = \sum_{u=i-\frac{w}{2}}^{i+\frac{w}{2}}\ \sum_{v=j-\frac{w}{2}}^{j+\frac{w}{2}} \bigl(G_x^2(u,v) - G_y^2(u,v)\bigr) \qquad (3\text{-}3) $$

$$ O_E(i,j) = \sum_{u=i-\frac{w}{2}}^{i+\frac{w}{2}}\ \sum_{v=j-\frac{w}{2}}^{j+\frac{w}{2}} \bigl(G_x^2(u,v) + G_y^2(u,v)\bigr) \qquad (3\text{-}4) $$

$$ Coh = \frac{\sqrt{O_x^2(i,j) + O_y^2(i,j)}}{O_E(i,j)\cdot w\cdot w} \qquad (3\text{-}5) $$
So, if Coh is larger than a threshold, the block is considered foreground; otherwise it belongs to the background. The segmentation result of this method is shown in Figure (3-3 (b)). Both of the previous methods were chosen to segment the fingerprint object because they can correctly segment fingerprint images whose boundary is distinct. At the same time, they are sensitive to image quality, which makes them good indicators of low-quality fingerprint images. The grey-level-based method gives an indication of the wetness and dryness of fingerprint images, while the direction-based method shows the orientation contours of the ridge and valley structure; both indications are very useful in validity estimation as well as in quality benchmarking. Finally, segmenting an image simplifies it, making it easier to analyse, and is therefore a key part of computer vision, image processing, and security key generation.
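As a sketch of the direction-based criterion of Equations (3-2)–(3-5), the following computes block coherence from Sobel gradients; the gradient operator, block size and foreground threshold are assumptions chosen for demonstration only.

```python
import numpy as np
from scipy import ndimage

def block_coherence(img, w=16):
    """Compute the coherence of Equations (3-2)-(3-5) for each w x w block:
    Coh = sqrt(Ox^2 + Oy^2) / (OE * w * w), from x/y gradient components."""
    gx = ndimage.sobel(img.astype(float), axis=1)
    gy = ndimage.sobel(img.astype(float), axis=0)
    h, wd = img.shape
    coh = np.zeros((h // w, wd // w))
    for bi in range(h // w):
        for bj in range(wd // w):
            sy = slice(bi * w, (bi + 1) * w)
            sx = slice(bj * w, (bj + 1) * w)
            ox = 2.0 * np.sum(gx[sy, sx] * gy[sy, sx])                      # Eq. (3-2)
            oy = np.sum(gx[sy, sx] ** 2 - gy[sy, sx] ** 2)                  # Eq. (3-3)
            oe = np.sum(gx[sy, sx] ** 2 + gy[sy, sx] ** 2)                  # Eq. (3-4)
            coh[bi, bj] = np.sqrt(ox ** 2 + oy ** 2) / (oe * w * w + 1e-9)  # Eq. (3-5)
    return coh

if __name__ == "__main__":
    img = np.random.randint(0, 256, (128, 128))
    fg = block_coherence(img) > 1e-4  # foreground if coherence exceeds a chosen threshold
    print("foreground blocks:", int(fg.sum()))
```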
3.4 Fingerprint Pattern Analysis
As defined in section 1, fingerprints are the patterns on the inside and the tips of fingers. The ridges of skin, also known as friction ridges, together with the valleys
between them form unique patterns on the fingers. From an image-anatomy processing point of view, fingerprint pattern analysis is a deconstruction of the object patterns, e.g. the ridge and valley structures that form one of a number of different fingerprint patterns used in a fingerprint system. Fingerprint local structure constitutes the main texture-like pattern
of ridges and valleys within a local region. The local structure analysis of a ridge output
extraction, i.e. minutia, describes a rotation and translation invariant feature of that
minutia in its neighbourhood. A valid global structure puts the ridges and valleys into a
smooth flow for the entire fingerprint; it reliably determines the uniqueness of a
fingerprint. Both local and global structuring analyses determine the quality and validity
of a fingerprint image.
Figure 3-4 Examples of minutiae types

3.4.1 Local Analysis

The localization of the minutiae in a fingerprint forms a valid and compact representation of the fingerprint. The validity judgment of a fingerprint image depends on the following factors: image contrast, the graphical representation of the fingerprint elements (such as ridge and valley clarity), and noise contamination. The local information of a fingerprint image can be obtained from its pixel-value representation, where pixels indicate the light intensity of the fingerprint image elements, as well as from its grey-value representation on a grey-value map. This is useful for several fingerprint processing techniques, such as threshold calculation, grey-level-based segmentation, pixel-based enhancement and validity checking based on enhancement percentages. Since local analysis gives contrast information about the ridge and valley structure, the goodness, dryness and smudginess of the whole fingerprint can be determined. In this respect the pixel is a good predictor of image information: for example, black pixels are dominant if a fingerprint is wet and the average thickness of the ridges is larger than that of the valleys, and vice versa for a dry fingerprint. The severity of fingerprint image damage can be determined from statistical properties, i.e. the standard deviation and mean value over a local block division of the fingerprint. A valid fingerprint image tends to have a small deviation value for both ridge and valley.
3.4.2 Global Analysis
Global representation is an overall attribute of the finger and a single representation is
valid for the entire fingerprint and is typically determined by an examination of the entire
finger. The global structure is the overall pattern of the ridges and valleys. Fingerprint
images are very rich in information content. The main type of information in the
fingerprint image is the overall flow information, which is defined by the pattern of the
ridges and valleys in the fingerprint [33]. The fingerprint global structure provides discriminatory information beyond the traditional, widely used minutiae points. Global structure, such as the global ridge structure and singularities, is used to determine the fingerprint classification; it is also beneficial for the alignment of fingerprints which are incomplete or of poor quality. Global structure analysis is used to certify the localized texture pattern of the fingerprint images, while the ridge-to-valley structure is analyzed to detect invalid images. Fingerprint images possess continuity and uniformity
as their general characteristic. Continuity is found along the orientation change, while uniformity is observed all over the image in its ridge and valley structure; together they are considered a fingerprint global factor. Global uniformity and continuity ensure that the image is valid as a whole. The commonly used global fingerprint structuring features
are:
Singular points – discontinuities in the orientation field. There are two types of
singular points. A core is the uppermost of the innermost curving ridge [77], and
a delta point is the junction point where three ridge flows meet. They are usually
used for fingerprint registration and fingerprint classification.
Ridge frequency map – the reciprocal of the ridge distance in the direction
perpendicular to local ridge orientation. It is formally defined in [33] and is
extensively utilized for contextual filtering of fingerprint images.
This representation is sensitive to the quality of the fingerprint images [6]; however, its discriminative ability is limited in the absence of singular points.
The grey levels of the image are divided by a threshold t into two classes, with class probability distributions formed from the normalized grey-level probabilities $p_i$:

$$ C_1: \frac{p_1}{\omega_1(t)}, \ldots, \frac{p_t}{\omega_1(t)} \quad\text{and}\quad C_2: \frac{p_{t+1}}{\omega_2(t)}, \frac{p_{t+2}}{\omega_2(t)}, \ldots, \frac{p_L}{\omega_2(t)}, $$

where

$$ \omega_1(t) = \sum_{i=1}^{t} p_i \qquad (3\text{-}6) $$

and

$$ \omega_2(t) = \sum_{i=t+1}^{L} p_i \qquad (3\text{-}7) $$

Also, the means for classes $C_1$ and $C_2$ are

$$ \mu_1 = \sum_{i=1}^{t} \frac{i\,p_i}{\omega_1(t)} \qquad (3\text{-}8) $$

and

$$ \mu_2 = \sum_{i=t+1}^{L} \frac{i\,p_i}{\omega_2(t)} \qquad (3\text{-}9) $$

The mean intensity $\mu_T$ of the whole image satisfies

$$ \omega_1\mu_1 + \omega_2\mu_2 = \mu_T \qquad (3\text{-}10) $$

$$ \omega_1 + \omega_2 = 1 \qquad (3\text{-}11) $$

The between-class variance of the thresholded image is defined using discriminant analysis [9]:

$$ \sigma_B^2 = \omega_1(\mu_1 - \mu_T)^2 + \omega_2(\mu_2 - \mu_T)^2 \qquad (3\text{-}12) $$

The optimal threshold $ot$ is chosen so that the between-class variance $\sigma_B^2$ is maximized:

$$ ot = \max_t\{\sigma_B^2(t)\} \qquad (3\text{-}13) $$

The object region is then segmented from the background by:

$$ \text{if } I(x, y) > T \text{ then } I(x, y) \in \text{object, else } I(x, y) \in \text{background} $$
The background is then subtracted so that subsequent processing blocks operate on a purely segmented, thresholded and binarized black-and-white image.
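A short sketch of the between-class variance maximization of Equations (3-6)–(3-13) is given below, using a normalized 256-bin histogram; the binning and the synthetic test data are assumptions for illustration.

```python
import numpy as np

def otsu_threshold(img):
    """Scan all thresholds t, computing class probabilities (3-6, 3-7), class means (3-8, 3-9)
    and between-class variance (3-12); return the t that maximizes it (3-13)."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()                       # normalized grey-level probabilities p_i
    levels = np.arange(256)
    mu_T = (levels * p).sum()                   # whole-image mean intensity
    best_t, best_var = 0, -1.0
    for t in range(1, 255):
        w1, w2 = p[:t].sum(), p[t:].sum()       # omega_1(t), omega_2(t)
        if w1 == 0 or w2 == 0:
            continue
        mu1 = (levels[:t] * p[:t]).sum() / w1   # class means
        mu2 = (levels[t:] * p[t:]).sum() / w2
        var_b = w1 * (mu1 - mu_T) ** 2 + w2 * (mu2 - mu_T) ** 2
        if var_b > best_var:
            best_t, best_var = t, var_b
    return best_t

if __name__ == "__main__":
    img = np.concatenate([np.random.normal(60, 10, 5000),
                          np.random.normal(180, 10, 5000)]).clip(0, 255)
    print("Otsu threshold:", otsu_threshold(img))
```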
Figure 3-7 (a) Object segmented areas, (b–b') object weighted areas (dry finger/light print, moist finger/dark print, worn ridge structure, poor finger placement, no object structure)
of image object, percentage of finger image]. The validity factors are scored between 0 and 100: 0 for no satisfaction of the factor, 100 for excellent presence of the factor. For a more refined assessment of image validity, the IQS was given to 15 subjects working in the field of image processing and biometrics, since they are familiar with images and their distortions. The 15 scores for each image were averaged into a final validity MOS, Table 3-1.
[Table 3-1: per-image validity MOS, NFIQ and VCA scores]

[Table: correlation between validity MOS, NFIQ and VCA]
Correlation results indicate that the proposed algorithm is feasible in detecting low
quality as well as non-fingerprint images.
3.7 Summary
In this chapter, a novel approach for image validity checking is presented. It is computationally efficient, since no complicated processes are required and it reuses system pre-processing blocks such as segmentation and subtraction. Results show that the proposed approach is competitive with the state-of-the-art method NFIQ, and it could serve as a complementary factor in the image quality assessment process. Studying the characteristic structure of other biometric objects, such as iris and face, suggests that the implemented approach could also be applied to them. With the development of acquisition devices, and the combination of NFIQ (or any image quality estimation method) with the VCA algorithm, acquisition devices such as scanners will enter a new era of smart detection technology and checking of capturing sources. The following summarized remarks are useful for the study of fingerprint images.
4.1 Introduction
Fingerprint images are subject to a wide variety of distortions during acquisition, analysis, processing, compression, storage, transmission and reproduction, any of which may degrade their visual quality. The most fundamental problem of the error-visibility framework is the definition of image quality: in particular, it is not clear that error visibility should be equated with image quality degradation, since some types of distortion may be clearly visible but not perceptually objectionable. Images may be corrupted by sources of degradation arising during acquisition, transmission, processing and reproduction [79].
To maintain, control, and enhance the quality of images, it is important for image life
cycle systems, e.g. acquisition, management, communication, and processing to be able
to identify and quantify image quality degradations [80]. The development of real-time fingerprint image quality assessment can greatly improve the accuracy of fingerprint-image-based systems; it is utilized to evaluate system performance [23, 25, 81, 82], assess enrolment acceptability [83], evaluate the performance of fingerprint sensors and improve the quality of fingerprint databases [84]. The idea is to classify fingerprint images based on their quality, and it is desirable to assess the quality of a fingerprint image in real time as a quality-control procedure. This allows poor image acquisition to be corrected through recapture and facilitates the capture of the best possible image within the capture time window configured in the system; the image capture system can then be calibrated and controlled to satisfy the image quality parameters. On this basis it is appropriate to select minor or major image pre-processing techniques. The essential factors for fingerprint image quality metrics are: captured image size; captured image position and placement, Figure (4-1(a)); image orientation, Figure (4-1(b)); ridge clearness, Figure (4-2); quantity of matching features, Figure (4-3); and image distortion, Figure (4-4), which is difficult to assess without actual matching. Good quality images require only minor pre-processing and enhancement, while bad quality images should be rejected. Processing parameters for dry images (low quality) and wet images (low quality) should be determined automatically. These results can be improved by capturing more good-quality fingerprint images, increasing the system identification accuracy and the integrity of the fingerprint database. In this chapter, we aim to develop a scheme which allows the quantitative, deterministic assessment of fingerprint image quality. The scheme assesses what percentage (or size) of the given image may contain an actual fingerprint and how reliably the ridge flow can be detected in the located fingerprint area. This assessment should agree as closely as possible with the previously obtained subjective analysis test. It should be noted that exact correlation will never be achieved due to natural variations in the subjective assessment. The most meaningful image quality measures are based on visual assessment (subjective measurement). Subjective tests have shown that the eye tends to concentrate on those areas of a scene or image where there is a high concentration of contours, i.e. the ridges and valleys of fingerprint images. In this chapter, we describe the construction of image quality measurements and the performance evaluation of image quality assessment techniques. A new quality assessment technique based on quality-of-service assurance is proposed, and comparison results are then discussed.
Figure 4-1 (a) Fingerprint image capturing position and placement, (b) Orientation field

Figure 4-2 Ridge clearness images
4.2 Image Quality Measures
Image quality assessment and comparison metrics play an important role in various
graphics orientated applications. They can be used to monitor image quality for further
processing systems, they can be employed to benchmark image processing algorithms,
and they can be embedded into the rendering algorithms to optimize their performances
and parameter settings. Fingerprint images may undergo distortions during preliminary
acquisition process, compression, restoration, communication or final database enrolment.
Hence image quality measurement plays a significant role in several image-processing
applications. Image quality, for scientific, forensic and security purposes, can be defined in terms of how well the desired information can be extracted from the source image. An image is said to have acceptable quality if it shows satisfactory usefulness, meaning that its content can be discriminated and its features extracted, and satisfactory clearness, meaning that the fingerprint image content, i.e. ridges and valleys, can be identified. Image
quality metrics are important performance variables for digital imaging database systems,
and are used to measure the visual quality of processing images [85]. There are three
major types of quality measurement: subjective, objective and perceptual. In this thesis, the correlation coefficient between the proposed objective quality assessment algorithm and the subjective opinion scores is investigated. Measuring the quality of fingerprint images is clearly required; hence, quality measurement is one of the pre-processing stages of cryptographic key generation models.
We can acquire a higher quality image by taking the image quality into account in the post-processing stage of the authentication and matching process. Also, by rejecting a low quality image and asking the user to input the fingerprint again, Figure (4-1(a)), we can guarantee better image quality. The bifurcations and ridges become unclear if there is too much or too little pressure on the finger during input. If the quality of a fingerprint is poor (bad), three problems arise: detection of false minutiae, omission of genuine minutiae, and errors in the positions of minutiae. To solve these problems, the enrolment stage must have a measure for selecting good quality fingerprint images.
Quality measurement is increasingly deployed in all biometric-based systems to predict the performance of a given system, i.e. as evaluation criteria for assessing the performance of image enhancement, feature extraction and matching with respect to the quality indices.
$$ \frac{\sigma I}{I} = K \qquad (4\text{-}1) $$

where I is the background luminance, σI is the just-noticeable incremental luminance over the background for the HVS, and K is a constant known as the Weber fraction. Weber's law holds over a wide range of background luminances and breaks down only at very low or very high light levels. Light adaptation allows the HVS to encode the
contrast of the visual stimulus instead of light intensity. The fourth feature of HVS is
contrast masking, which refers to the reduction in visibility of one image component due to the presence of a masker. Masker strength is measured by the variation of signal visibility in the presence or absence of the masker. The HVS is enormously complex, with optical, synaptic, photochemical and electrical phenomena. The HVS is modelled for objective representation, yet it remains at the core of subjective assessment, i.e. the visual basis of the observers' opinion scores, and its reliability affects the mean result score. Observers' decisions are confined to standard values defined by the International Telecommunication Union (ITU); the ITU suggests standard viewing conditions, criteria for the selection of observers and test material, assessment procedures, and data analysis methods. The ITU has recommended a 5-point scale using the adjectives bad, poor, fair, good and excellent [87]. The ITU scale was used in all subjective tests, as well as being the basic scale for the MOS [19]. The MOS is generated by averaging the results of a set of subjective tests, in which a number of subjects are asked to view the test images and rate their quality. Subjective tests may measure impairment scores as well as quality scores; observers can also be asked to rate the degree of distortion, the amount of defects or the strength of artefacts. Subjective quality measurement techniques provide numerical values that quantify the viewer's satisfaction; however, subjective experiments require careful setup and are time consuming, because observers' responses may vary; hence they are expensive and often impractical. Furthermore, for many applications, such as online quality monitoring and control, subjective experiments cannot be used at all. They provide no constructive methods for performance improvement and are difficult to use as part of a design process. They are used to predict the success of proposed objective methods through their correlation with them.
eye characteristics, e.g. CSF, light adaptation, and masking [88]. The perceptual metrics
can provide a more consistent estimation of image quality than objective metrics when
artefacts are near the visual threshold. Image discrimination models used in perceptual
quality assessment, however, have been developed for measuring general quality
degradation introduced by compression processes. The implementation of these metrics is
also often complex, and time-consuming subjective psychophysical testing is required for
validation [89]. While task-based model observers have been designed to predict human visual detection of signals embedded in noisy backgrounds, the effect of quality degradations on the performance of detecting analysis features in fingerprint images requires further investigation. This kind of measurement could be used in refining
fingerprint images for the purpose of updating database sources. Fingerprint image
database refining is based on image fidelity, which is the subset of overall image quality
that specifically addresses the visual equivalence of two images. It is used to determine
the difference between two images that are visible to the human visual system. Usually
one of the images is the reference which is considered to be original, perfect or
uncorrupted. The second image has been modified or distorted in some sense. It is very
difficult to evaluate the quality of an image without a reference. Thus, a more appropriate
term would be image fidelity or integrity, or alternatively, image distortion. In addition to the two digital images, an image fidelity metric based on a perceptual model requires a few other parameters, e.g. viewing distance, image size and display parameters. The output is a
number that represents the probability that a human eye can detect a difference in the two
images or a number that quantifies the perceptual dissimilarity between the two images.
Alternatively, the output of an image perceptual metric could be a map of detection
probabilities or perceptual dissimilarity values. The most common stages that are
included in the perceptual model are:
Registration, i.e. the point by point correspondence between two images, is necessary for
any quality metric to make any sense. Otherwise, the value of a metric could be
arbitrarily modified by shifting one of the images. The shift does not change the images
but changes the value of the metric.
Display model, i.e. an accurate model of the display device, is an essential part of any image quality metric, as the HVS can only see what the display can reproduce. Display model effects are incorporated in the perceptual model; therefore, when the display changes, a new set of perceptual model parameters must be obtained.
$$ MSE = \frac{1}{I}\sum_{i=1}^{I}(R_i - D_i)^2 \qquad (4\text{-}2) $$

$$ PSNR = 10\cdot\log_{10}\frac{RM^2}{MSE} \qquad (4\text{-}3) $$
where I is the total number of pixels in the image and RM is the maximum possible
reference intensity value. The analysis depends on the number of images used in the
measurement and the nature or type of measurement using the pixel elements of digitized
images. However, these simple measures operate solely on the basis of pixel-wise
differences and neglect the important influence of region of interest image content and
viewing conditions on the actual visibility of artefacts. Therefore, they cannot be
expected to be reliable predictors of perceived quality. Metrics have been defined either
in the spatial or frequency domain. These measurement techniques are easy to calculate,
however they do not consider human visual sensitivities. They do not adequately predict
distortion visibility and visual quality for images with large luminance variations or with
varying content. Objective quality assessment can be classified into graphical and numerical classes: the histogram criterion is an example of the graphical class and the MSE is a numerical example. It is believed that a combination of numerical and graphical measures may prove useful in judging image quality. Objective image quality metrics can be classified
according to the availability of an original (distortion-free) image, with which the
distorted image is to be compared. Most existing approaches are known as full-reference,
meaning that a complete reference image is assumed to be known. In many practical
applications, however, the reference image is not available, and a no-reference or “blind”
quality assessment approach is desirable. In a third type of methods, the reference image
is only partially available, in the form of a set of extracted features made available as side
information to help evaluate the quality of the distorted image. This is referred to as
reduced-reference quality assessment. This thesis focuses on non-reference image quality
assessment for the sake of automatic acceptance and rejection of target fingerprint image.
Thus, the term quality assessment is not used here to refer to the fidelity of the tested
sample, but instead to the utility of the sample to an automated system. It is a difficult
task to weight objectively the clearness of fingerprint ridges, low noise and good image contrast. A blind quality assessment is a good indicator for the validity check, while the validity benchmark is a good quality estimator, and vice versa [91]. Both validity and quality estimators are used as predictors of the matching performance of biometric systems. The
main goal of objective quality assessment is to design algorithms whose quality
prediction is in good agreement with subjective scores from human observers. There are
different attributes that characterize an objective quality approach in terms of its
prediction performance with respect to the MOS [56]. The most important one is accuracy, the ability of a metric to predict subjective ratings with minimum average error; it can be determined by means of the Pearson linear correlation coefficient (PLCC). For a set of N data pairs $(x_i, y_i)$, it is defined as follows:

$$ \text{Pearson} = \frac{\sum_{i=1}^{N}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{N}(x_i - \bar{x})^2\ \sum_{i=1}^{N}(y_i - \bar{y})^2}} \qquad (4\text{-}4) $$

where $\bar{x}$ and $\bar{y}$ are the means of the objective and subjective data, respectively.
This assumes a linear relation between the data sets, which may not be the case.
Therefore, in this thesis correlation will be used to obtain relative comparisons between
subjective and objective data, as well as to investigate the performance of the proposed
objective metrics. The objective metrics developed in this thesis will be used in different
stages of bio crypto image processing based and analysis systems as monitoring,
optimization, and benchmarking, and will be compared to the state of the art objective
metrics currently in use.
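For reference, Equation (4-4) can be computed directly as sketched below; the sample scores are invented purely to show the call and are not taken from the experiments.

```python
import numpy as np

def pearson(objective, subjective):
    """Pearson linear correlation coefficient of Equation (4-4)."""
    x = np.asarray(objective, dtype=float)
    y = np.asarray(subjective, dtype=float)
    xd, yd = x - x.mean(), y - y.mean()
    return float((xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum()))

if __name__ == "__main__":
    mos = [0.30, 0.60, 0.12, 0.68, 0.27]   # illustrative subjective scores
    est = [0.22, 0.89, 0.22, 0.72, 0.22]   # illustrative objective scores
    print("PLCC:", round(pearson(est, mos), 3))
```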
methods are in most cases easy to apply, but only in a few cases can their results be generalised. Fingerprint image quality assessment is a difficult yet very important task in the evaluation of any fingerprint imaging application. Fingerprint image quality affects the performance and interoperability of fingerprint-based applications, e.g. identification, authentication, and the crypto systems built on them. The basic premise of fingerprint image quality assessment rests on the extractable information, e.g. ridges and valleys: how this information is extracted, how it correlates with the extracting observation, and finally the statistical relation between image and object according to the noise measurement. Blind quality assessment is desirable in finger crypto key generation, where the reference image is unavailable and the assessment must be made from the available image information and/or the availability of extraction-based features.
for database renewing where it requires the entire reference content to be available,
usually in uncompressed form, which is quite an important restriction on the usability of
such metrics. In general, full reference assessment considers structural similarity (SSIM)
and peak signal to noise ratio (PSNR) as image quality assessors.
Both the SSIM and PSNR are related to the human visual system, noting that people evaluate image quality based on structural information rather than pixel intensities themselves. The principal idea underlying the structural similarity approach is that the
HVS is highly adapted to extract structural information from visual scenes, and therefore,
a measurement of structural similarity (or distortion) should provide a good
approximation to perceptual image quality. A full reference method can be used also in
evaluation and comparative study of fingerprint image quality estimation and
benchmarking approaches.
Figure 4-6 Block diagram of conventional reduced reference image quality methods.
At the sender side, a feature extraction process is applied to the original image, and then
the extracted features are transmitted to the receiver as side information through an
ancillary channel. It is usually assumed that the ancillary channel is error-free. Another choice is to send the RR features in the same channel as the image being transmitted; in that case, stronger protection of the RR features relative to the image data is usually needed. When the distorted image is transmitted to the receiver through a
regular communication channel with distortions, feature extraction is also applied at the
receiver side. This could be exactly the same process as in the sender side, but it might be
adjusted according to the side information, which is available at the receiver side. In the
final stage of RR quality assessment method, the features that were extracted from both
the reference and distorted images are employed to yield a scalar quality score that describes the quality of the distorted image. RR features should provide an efficient summary of the reference image, be sensitive to a variety of image distortions, and have good perceptual relevance. In most cases, RR features are simply a set of
randomly selected image pixels. When these pixels are transmitted to the receiver, they
are compared with corresponding pixels in the distorted image. The MSE or PSNR value
between the reference and distorted images can then be estimated. This method is used
for assessing the quality of fingerprint images by extracting informative features.
Fronthaler, et al. [94] exploited the orientation tensor which holds edge and texture
information to assess the quality of fingerprint images. Their method decomposes the
orientation tensor of an image into symmetry representations, where the included
symmetries are related to the particular definition of quality and encode a priori content-
knowledge about the application (e.g. fingerprints).
Methods that rely on image features, either global or local, are tied to structural representation [23, 29, 97, 98]; they are divided into three types: power spectrum, orientation flow, and Gabor-feature based. A second class was proposed in [36] as an intelligent neural-network fingerprint image quality estimator. This type of method relies on computing a feature vector using the quality image "map" and the minutiae quality statistics produced by the minutiae detection algorithm. The feature vector is then used as input to a Multi-Layer Perceptron (MLP) neural network classifier, and the output activation level of the neural network determines the fingerprint image quality value.
Figure 4-8 Good fingerprint images (Nik_index1.tif, 1_ben_index2.tif, 7_nik_auricu1.tif)
For instance, falsely extracted features may appear due to poor and invalid fingerprint factors. Quality assurance of the stored template at enrolment in an automatic recognition system, quality-aware fusion, and image region variety checks for enhancement guidance are among the benefits of blind fingerprint image quality assessment [94]. It is therefore desirable to assess image quality to improve the overall performance of biometric systems, but this is a difficult task because it requires blind, automatic prediction of perceived image quality. We used the Gabor and Fourier power spectrum methods to obtain a novel Gabor spectrum (GS) approach [19], and quantitatively compared the implementation results of GS against the two existing classes, as well as against a manual human image quality survey (IQS), which assigned quality estimation values on the TIMA database [22].
4.4.1 Gabor Features
The characteristics of the Gabor filter, especially its frequency and orientation representations, are similar to those of the human visual system. Therefore, Gabor features were used for computing the foreground-background segmentation and the degree of smudginess and dryness of fingerprint images. The
2D Gabor function is represented as a Gaussian function modulated by a complex
sinusoidal signal and it is adopted for feature extraction, Equation (2-6). Since most local
ridge structures of fingerprints can be modelled as oriented sinusoids along a direction
normal to the local ridge orientation [97], the Gabor parameters are set to the following values: frequency of the sinusoidal plane wave f = 0.125 (corresponding to an inter-ridge distance of 8 pixels); standard deviations of the Gaussian envelope along the x and y axes σ_x = σ_y = 4; and θ ∈ {0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°}, resulting in eight Gabor filters, Figures (4-10), (4-11). These values were used for the database quality analyses. Gabor feature extraction is performed by convolving the input image with the set of Gabor filters, Equation (4-5); it is used to determine the quality of fingerprint images [23, 98]. The image is divided into blocks of size w centred at (X, Y), and the magnitude Gabor feature at each sampling point is defined as follows:

$$ g(X, Y, \theta_k, f, \sigma_x, \sigma_y) = \left|\ \sum_{x=-w/2}^{(w/2)-1}\ \sum_{y=-w/2}^{(w/2)-1} I(X+x, Y+y)\, h(x, y, \theta_k, f, \sigma_x, \sigma_y)\right| \qquad (4\text{-}5) $$
where k = 1, ..., m, I(x, y) denotes the grey-level value of the pixel (x, y), and w is the size of the blocks of the divided image. m Gabor matrices are obtained according to the Gabor parameter set. Each block is then sampled by these matrices and m Gabor features are obtained: a (w × w) block is thus compressed to m meaningful Gabor features. Finally, the standard deviation value of each block is computed by Equation (2-7).
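A minimal sketch of the block-wise Gabor feature of Equation (4-5) is given below, with the eight orientations and the parameter values quoted above (f = 0.125, σx = σy = 4); the kernel size, the sampling at the block centre and the even-symmetric kernel construction are simplifying assumptions rather than the exact implementation.

```python
import numpy as np

def gabor_kernel(theta, f=0.125, sigma=4.0, size=17):
    """Even-symmetric Gabor kernel: Gaussian envelope modulated by a cosine along theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-0.5 * (xr ** 2 + yr ** 2) / sigma ** 2) * np.cos(2 * np.pi * f * xr)

def block_gabor_features(img, w=32):
    """For each w x w block, compute the magnitude Gabor response (Eq. 4-5) at 8 orientations,
    sampled at the block centre, and return the standard deviation of those m responses."""
    kernels = [gabor_kernel(k * np.pi / 8) for k in range(8)]
    half = kernels[0].shape[0] // 2
    h, wd = img.shape
    feats = np.zeros((h // w, wd // w))
    for bi in range(h // w):
        for bj in range(wd // w):
            block = img[bi * w:(bi + 1) * w, bj * w:(bj + 1) * w].astype(float)
            cy, cx = block.shape[0] // 2, block.shape[1] // 2
            patch = block[cy - half:cy + half + 1, cx - half:cx + half + 1]
            resp = [abs(np.sum(patch * k)) for k in kernels]   # |sum I*h| per orientation
            feats[bi, bj] = np.std(resp)                       # block feature (cf. Eq. 2-7)
    return feats

if __name__ == "__main__":
    img = np.random.randint(0, 256, (256, 256))
    print("mean block feature:", block_gabor_features(img).mean())
```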
Figure 4-10 Gabor features of (Nik_index1.tif) fingerprint images
The standard deviation value is used for both image quality estimation and fingerprint image segmentation. The fingerprint area (foreground) is segmented depending on the standard deviation value: if it is less than the block threshold value, the block is counted as background; otherwise it is counted as a foreground block. A foreground block is counted as good quality if its quality field exceeds the preset quality threshold; otherwise it is counted as a bad block. The total quality of the fingerprint image is calculated according to the quantities of foreground blocks, Equation (2-8). The fingerprint image is counted as good quality if its total quality value is larger than a predetermined threshold; otherwise it is counted as a poor quality image.
4.4.2 Gabor Spectral Method
The Gabor spectral method (GSM) makes use of the benefits of both the Gabor and the Fourier power spectrum methods, namely their frequency and orientation representations. GSM is based on spectrum analysis of the Gabor feature banks within the orientations [0: π/8: 7π/8] for all non-overlapping fingerprint image blocks. The method flowchart is shown in Figure (4-12). GSM first resizes the fingerprint image to (256x256) so that all images have the same size, and then uses (32x32) non-overlapping blocks for block quality estimation. The GSM procedure then calculates the spectrum of the Gabor feature banks within the given orientations, section (4.4.3). The standard deviation of the calculated feature is used to determine the quality of the block under processing, and the total quality of the image is calculated by averaging the block qualities. The final quality is measured in the range [0, 1] after applying a chosen normalization factor to the averaged quality.
4.4.3 GSM Mathematical Background Analysis
The even-symmetric Gabor function captures the filtering characteristics, in frequency and orientation, that resemble those of the human visual system [99]. The Gabor feature spectrum is found by taking the Fourier transform of the 2-D even-symmetric Gabor function, defined as

$$ g(x, y) = \exp\left\{-\frac{1}{2}\left[\frac{x'^2}{\sigma_{x'}^2} + \frac{y'^2}{\sigma_{y'}^2}\right]\right\}\cos(2\pi f x') \qquad (4\text{-}6) $$

where $x' = x\cos\theta + y\sin\theta$ and $y' = -x\sin\theta + y\cos\theta$ are the rotated coordinates. Thus

$$ g(x, y) = \exp\left\{-\frac{1}{2}\left[\frac{x'^2}{\sigma_{x'}^2} + \frac{y'^2}{\sigma_{y'}^2}\right]\right\}\cos\bigl(2\pi f x\cos\theta + 2\pi f y\sin\theta\bigr) $$

Writing $\theta_1 = 2\pi f x\cos\theta$ and $\theta_2 = 2\pi f y\sin\theta$, and using the trigonometric identity $\cos(\theta_1 + \theta_2) = \cos\theta_1\cos\theta_2 - \sin\theta_1\sin\theta_2$,

$$ g(x, y) = \exp\left\{-\frac{1}{2}\left[\frac{x'^2}{\sigma_{x'}^2} + \frac{y'^2}{\sigma_{y'}^2}\right]\right\}\bigl(\cos\theta_1\cos\theta_2 - \sin\theta_1\sin\theta_2\bigr) \qquad (4\text{-}7) $$

The Fourier transform of Equation (4-7), the 2-D even-symmetric Gabor function, is

$$ F\{g(x, y)\}(u, v) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x, y)\, e^{-2j\pi(ux + vy)}\, dx\, dy \qquad (4\text{-}8) $$

that is,

$$ F(u, v) = \int\!\!\int e^{-\frac{1}{2}\left[\frac{x'^2}{\sigma_{x'}^2} + \frac{y'^2}{\sigma_{y'}^2}\right]}\cos\theta_1\cos\theta_2\, e^{-2j\pi(ux+vy)}\, dx\, dy \;-\; \int\!\!\int e^{-\frac{1}{2}\left[\frac{x'^2}{\sigma_{x'}^2} + \frac{y'^2}{\sigma_{y'}^2}\right]}\sin\theta_1\sin\theta_2\, e^{-2j\pi(ux+vy)}\, dx\, dy $$

Substituting $\cos\theta = \frac{1}{2}(e^{j\theta} + e^{-j\theta})$ and $\sin\theta = \frac{1}{2j}(e^{j\theta} - e^{-j\theta})$ and expanding, the cross terms $e^{j(-\theta_1+\theta_2)}$ and $e^{j(\theta_1-\theta_2)}$ cancel between the two integrals, leaving

$$ F(u, v) = \frac{1}{2}\int\!\!\int e^{-\frac{1}{2}\left[\frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2}\right]}\bigl[e^{j(\theta_1+\theta_2)} + e^{j(-\theta_1-\theta_2)}\bigr]\, e^{-2j\pi(ux+vy)}\, dx\, dy \qquad (4\text{-}9) $$

For the first term, separating the x and y integrations and completing the square,

$$ \int\!\!\int e^{-\frac{1}{2}\left[\frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2}\right]} e^{j(\theta_1+\theta_2)} e^{-2j\pi(ux+vy)}\, dx\, dy = \int e^{-\frac{1}{2\sigma_x^2}\left[x + 2j\pi\sigma_x^2(u - f\cos\theta)\right]^2} dx\; e^{-2\pi^2\sigma_x^2(u - f\cos\theta)^2}\cdot\int e^{-\frac{1}{2\sigma_y^2}\left[y + 2j\pi\sigma_y^2(v - f\sin\theta)\right]^2} dy\; e^{-2\pi^2\sigma_y^2(v - f\sin\theta)^2} $$

With the substitutions $t_1 = \frac{1}{\sqrt{2}\,\sigma_x}\bigl[x + 2j\pi\sigma_x^2(u - f\cos\theta)\bigr]$ and $t_2 = \frac{1}{\sqrt{2}\,\sigma_y}\bigl[y + 2j\pi\sigma_y^2(v - f\sin\theta)\bigr]$, each Gaussian integral evaluates to $\sqrt{\pi}$, so the first term becomes

$$ \sqrt{2}\,\sigma_x\sqrt{\pi}\; e^{-2\pi^2\sigma_x^2(u - f\cos\theta)^2}\cdot\sqrt{2}\,\sigma_y\sqrt{\pi}\; e^{-2\pi^2\sigma_y^2(v - f\sin\theta)^2} = 2\pi\sigma_x\sigma_y\, e^{-\frac{1}{2}\left[\frac{(u - f\cos\theta)^2}{\sigma_u^2} + \frac{(v - f\sin\theta)^2}{\sigma_v^2}\right]} $$

where $\sigma_u = \frac{1}{2\pi\sigma_x}$ and $\sigma_v = \frac{1}{2\pi\sigma_y}$. Similarly, for the second term,

$$ \int\!\!\int e^{-\frac{1}{2}\left[\frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2}\right]} e^{j(-\theta_1-\theta_2)} e^{-2j\pi(ux+vy)}\, dx\, dy = 2\pi\sigma_x\sigma_y\, e^{-\frac{1}{2}\left[\frac{(u + f\cos\theta)^2}{\sigma_u^2} + \frac{(v + f\sin\theta)^2}{\sigma_v^2}\right]} $$

With $A = \pi\sigma_x\sigma_y$, the resulting Gabor power spectrum is therefore

$$ PS = A\cdot\left( e^{-\frac{1}{2}\left[\frac{(u - f\cos\theta)^2}{\sigma_u^2} + \frac{(v - f\sin\theta)^2}{\sigma_v^2}\right]} + e^{-\frac{1}{2}\left[\frac{(u + f\cos\theta)^2}{\sigma_u^2} + \frac{(v + f\sin\theta)^2}{\sigma_v^2}\right]} \right)^2 \qquad (4\text{-}11) $$
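The derived spectrum of Equation (4-11) can be evaluated numerically as sketched below; the frequency grid and parameter values are illustrative assumptions.

```python
import numpy as np

def gabor_power_spectrum(u, v, theta, f=0.125, sigma_x=4.0, sigma_y=4.0):
    """Evaluate the analytic Gabor power spectrum PS(u, v) of Equation (4-11):
    a pair of Gaussian lobes centred at +/-(f cos(theta), f sin(theta))."""
    sigma_u = 1.0 / (2 * np.pi * sigma_x)
    sigma_v = 1.0 / (2 * np.pi * sigma_y)
    A = np.pi * sigma_x * sigma_y
    lobe_pos = np.exp(-0.5 * (((u - f * np.cos(theta)) / sigma_u) ** 2
                              + ((v - f * np.sin(theta)) / sigma_v) ** 2))
    lobe_neg = np.exp(-0.5 * (((u + f * np.cos(theta)) / sigma_u) ** 2
                              + ((v + f * np.sin(theta)) / sigma_v) ** 2))
    return A * (lobe_pos + lobe_neg) ** 2

if __name__ == "__main__":
    uu, vv = np.meshgrid(np.linspace(-0.25, 0.25, 101), np.linspace(-0.25, 0.25, 101))
    ps = gabor_power_spectrum(uu, vv, theta=np.pi / 4)
    print("peak spectrum value:", float(ps.max()))
```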
4.5 Experimental Analysis
The experimental study was split into two classes of tests: 1) objective (approach-based), e.g. the power spectrum approach (PS), directional contrast approach (DC), Gabor feature approach (GF), neural network approach (NN) and Gabor spectrum method (GSM); and 2) subjective (based on human observers), which is taken as the quality assessment reference standard. Correlation, scatter and reliability results form the performance evaluation of all approaches. Matlab was chosen as the implementation platform for all approaches except the neural network method, NFIQ, which was introduced as an independent quality estimator intensely trained to forecast matching performance [35] and is publicly available as part of the NIST Fingerprint Image Quality package, NIST Fingerprint Image Software 2 [37, 100]. The approaches were tested on 135 different fingerprint images, i.e. 90 good images and 45 faulty images, from the TIMA database [22]. Images were cropped from the centre to 256x256 size in JPG format and converted to WSQ format for the NFIQ test.
4.5.1 Subjective Test
Subjective evaluation is still a commonly used method of measuring image quality: image change, degradation and quality are quantified by asking specialist viewers to evaluate image quality subjectively. A set of fingerprint images from the TIMA database was viewed by 15 human observers working in the image processing and biometrics fields using a web-based image quality survey (IQS), Figure (4-13). The TIMA database was used because it contains fingerprint images of different quality degrees, e.g. bad, good, cut, no contact.
Figure 4-13 Image quality survey
The IQS was subjectively rated by the human observer participants. It was based on visual assessment (subjective measurement) and was conducted on images of different quality and validity taken from the above database. The validity factors were taken as image contrast, ridge clarity, valley clarity, image noise, and image content quality [informativeness of the image object, percentage of finger image]; each factor is scored between 0 and 100, 0 for no satisfaction of the factor and 100 for excellent presence of the factor. The scores of each image were averaged to a final validity and quality MOS, Equation (4-12). Table (4-1) shows partial data for the studied database with the investigated estimators, i.e. PS, DC, GF, NN and GSM, with reference to the MOS.

$$ MOS = \frac{1}{N}\sum_{i=1}^{N} score_i \qquad (4\text{-}12) $$

where N = 15.
Image MOS PS DC GF NN GSM
4_nik_index1.tif 0.3 0.34858 0.34813 0.3561 0.28 0.2247
7_nik_index12.tif 0.6 1.0271 0.37602 0.55714 0.55 0.895
no_contact_pb3.tif 0.124 0.42663 0.34275 0.38329 0.33 0.2247
4_nik_index3.tif 0.24 0.34536 0.34203 0.43132 0.36 0.2247
7_nik_index3.tif 0.68 0.80369 0.37297 0.5087 0.42 0.717
no_contact_pb5.tif 0.18 0.237 0.10766 0.22726 0.33 0.2247
4_nik_majeur1.tif 0.35 0.29413 0.33335 0.49927 0.43 0.2247
7_nik_majeur11.tif 0.59 0.6765 0.3697 0.72013 0.49 0.654
shift5.tif 0.27 0.37306 0.3716 0.22877 0.4 0.2247
1_ben_index2.tif 0.63 1.1697 0.3973 0.78126 0.51 0.895
4_nik_majeur3.tif 0.27 0.3264 0.31765 0.4949 0.41 0.2247
7_nik_pouce4.tif 0.51 0.8582 0.38976 0.7603 0.39 0.714
shift9.tif 0.54 0.72729 0.28156 0.90739 0.51 0.895
2_ben_for_ben.tif 0.5 0.2963 0.29555 0.44655 0.42 0.2247
4_nik_majeur4.tif 0.34 0.27675 0.28525 0.56322 0.39 0.2247
Strange13.tif 0.15 0.43839 0.40161 0.46243 0.3 0.2247
3_ben_index1.tif 0.36 0.39664 0.38335 0.37187 0.32 0.2247
4_nik_majeur5.tif 0.27 0.33683 0.33639 0.38928 0.38 0.2247
cut1.tif 0.38 0.34014 0.35812 0.50606 0.32 0.2247
Strange14.tif 0.14 0.25069 0.24745 0.59567 0.48 0.2247
3_ben_majeur6.tif 0.52 0.35877 0.38989 0.67835 0.42 0.2247
4_nik_pouce1.tif 0.27 0.29945 0.33075 0.54178 0.32 0.2247
cut3.tif 0.17 0.45529 0.41772 0.30789 0.27 0.2247
Strange6.tif 0.2 0.39757 0.39373 0.54719 0.4 0.2247
3_gui_index1.tif 0.3 0.39965 0.37967 0.29201 0.36 0.2247
4_nik_pouce3.tif 0.25 0.33416 0.33839 0.36871 0.27 0.2247
cut6.tif 0.16 1.0498 0.38311 0.39633 0.33 0.895
3_mar_index3.tif 0.31 0.29204 0.30181 0.57446 0.49 0.2247
4_nik_pouce4.tif 0.3 0.28912 0.28912 0.58932 0.38 0.2247
3_nik_index1.tif 0.35 0.31636 0.31692 0.38066 0.38 0.2247
7_nik_auricu1.tif 0.63 0.80909 0.34182 0.98746 0.42 0.895
3_nik_pouce_1.tif 0.31 0.37077 0.35334 0.38553 0.36 0.2247
7_nik_index1.tif 0.68 0.83553 0.37605 0.61696 0.4 0.667
nik_annu_g_9.tif 0.51 0.77328 0.39711 0.50023 0.42 0.614
Table 4-1 Part of the MOS (IQS), PS, DC, GF, NN (NFIQ) and GSM quality results
4.5.2 Accuracy and Correlation Analysis
The correlation coefficient indicates the strength and direction of the linear relationship between the objective estimators and the subjective MOS, i.e. how strong the relationship is and whether it is positive or negative. For the image quality estimators the Pearson correlation coefficient is chosen, because it is the best estimate of the correlation between two series. The correlation coefficient is computed after drawing a scatter plot of the data to confirm that a linear relationship is suggested. The scatter plots are illustrated in Figures (4-14) to (4-18), and the correlation coefficients in Table (4-2). Each point in a scatter graph represents one test image, with its vertical and horizontal coordinates representing the subjective MOS and the model prediction, i.e. the quality estimator, respectively. It is clear that the GSM results exhibit better consistency with the subjective data than the results of the other estimators. Table (4-2) shows the numerical evaluation results, where GSM is the most accurate and the most highly correlated with the MOS. GSM uses the benefit of spectrum analysis of Gabor feature detection to construct a blind image quality assessment, which can be used to monitor and control image enrolment in order to increase the efficiency of the whole dependent system, i.e. verification, identification and crypto key generation systems. The prediction monotonicity can be measured with the Spearman rank order correlation coefficient (SROCC) [101]. This measures the correlation between the objective measure's rank order and that of the subjective scores (MOS). The SROCC is described by the following equation:
\rho = 1 - \frac{6\sum D^{2}}{N\left(N^{2}-1\right)}          (4-13)
where D refers to the difference between the ranks of a subject on the two variables, and N is the number of data points. The estimator ranking and monotonicity results are shown in Table (4-3), where the greatest monotonicity and accuracy is again found using GSM.
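As an illustration of how the accuracy (Pearson) and monotonicity (SROCC) figures of Tables (4-2) and (4-3) can be reproduced from raw scores, the short sketch below computes both coefficients for one estimator against the MOS. The array values are copied from the first rows of Table (4-1); the SciPy functions are an assumption of this sketch and are not the Matlab implementation used in the experiments.

import numpy as np
from scipy import stats

# MOS and GSM columns for the first few images of Table (4-1)
mos = np.array([0.30, 0.60, 0.124, 0.24, 0.68, 0.18])
gsm = np.array([0.2247, 0.895, 0.2247, 0.2247, 0.717, 0.2247])

# Pearson correlation (accuracy)
pearson_r, _ = stats.pearsonr(mos, gsm)

# Spearman rank order correlation (monotonicity), Equation (4-13)
srocc, _ = stats.spearmanr(mos, gsm)

print(f"Pearson r = {pearson_r:.4f}, SROCC = {srocc:.4f}")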
Figure 4-14: Scatter plot of PS vs. MOS with Pearson correlation: 0.7822
Figure 4-15: Scatter plot of DC vs. MOS with Pearson correlation: 0.7641
Figure 4-16: Scatter plot of GF vs. MOS with Pearson correlation: 0.8231
Figure 4-17: Scatter plot of NN vs. MOS with Pearson correlation: 0.8009
Figure 4-18: Scatter plot of GSM vs. MOS with Pearson correlation: 0.8811
        MOS     PS      DC      GF      NN      GSM
MOS     1       0.7822  0.7641  0.8231  0.8009  0.8811
PS      0.7822  1
DC      0.7641          1
GF      0.8231                  1
NN      0.8009                          1
GSM     0.8811                                  1

Table 4-2 Pearson correlation coefficients between the MOS and the objective estimators

        MOS     PS      DC      GF      NN      GSM
MOS     1       0.7146  0.7336  0.7865  0.7927  0.8326
PS      0.7146  1
DC      0.7336          1
GF      0.7865                  1
NN      0.7927                          1
GSM     0.8326                                  1

Table 4-3 Spearman rank order correlation coefficients (SROCC) between the MOS and the objective estimators
investigated estimator. The test results show that GSM is highly reliable among the evaluated estimators, Table (4-4), Figure (4-19).
        PS      DC      GF      NN      GSM
FR      0.04    0.1     0.06    0.05    0.02
TR      0.96    0.9     0.94    0.95    0.98

Table 4-4 False rate (FR) and true rate (TR) of the investigated quality assessment approaches
Figure 4-19: False rate (FR) versus true rate (TR) of the image quality assessment approaches
conventional methods using the TIMA DB. Figure (4-20) shows the matching results as Receiver Operating Characteristic (ROC) curves in order to compare the proposed algorithm with existing algorithms. From this experiment, it is observed that the performance of the fingerprint verification system was significantly improved when the GSM quality estimation algorithm was applied to the input fingerprint images.
4.6 Summary
In this chapter, a heuristic non-reference image quality assessment and validity check approach based on Gabor feature spectrum analysis has been presented. Different image quality approaches were evaluated for comparison, and the proposed approach competes well with the other investigated methods. It behaves closest to human opinion on fingerprint validity and quality analysis, and comes out as excellent in comparison with the studied approaches. The proposed approach can effectively guide template selection at the enrolment stage and fingerprint image quality classification for automatic parameter selection in fingerprint image pre-processing.
Chapter 5 Fingerprint Crypto Key
Structure
5.1 Introduction
With the rapid diffusion of information technology (IT) and its outputs, biometrics-based
security systems are widely used in access control to computing resources, bank accounts
in ATM systems, computer security, and user validation in e-business [11]. Biometric techniques, as the core of biometric security systems, offer many advantages over traditional methods (token or knowledge based schemes), such as increased user convenience and robustness against impostor users, but they are vulnerable to attacks at every point from template production, through the transmission media, to the storage database. Thus, the possibility that a biometric database is compromised is one of the main concerns in implementing secure biometric systems, as is protecting the biometric template during its journey from enrolment to the matching stage. It is difficult to control and trace hacking and cracking of biometric systems by unauthorized people. Cryptographic techniques are being used
for information secrecy and authentication insurance in computer based security systems
[103]. Many cryptographic algorithms are available for securing information. For all
traditional algorithms the security is dependent on the secrecy of the secret or private key
when a user deploys a symmetric or a public key system, respectively. The user chooses
an easily remembered password that is used to encrypt the cryptographic key and this key
is then stored in a database. In order to retrieve the key back, the user enters the password
which will then be used to decrypt the key. In such systems, security of the cryptographic
key is weak due to practical problems of remembering various pass-codes or writing
them down to avoid data loss. Additionally, since the password is not directly tied to a
user, the system is unable to differentiate between the legitimate user and the attacker.
The limitations of password systems can be alleviated by a stronger, user-tied password such as biometrics. Bio-Crypt technology is the result of merging two important aspects of
digital security environment, biometrics and cryptography. There are various methods
that can be deployed to secure a key with a biometric. The first involves remote template matching and key storage, i.e. key release. In this method, the biometric image is captured and compared with a corresponding template; if the user is verified, the key is released. The main problem here is the use of an insecure storage medium [12, 15]. The second method hides the cryptographic key within the enrolment template itself via a secret bit-replacement algorithm, i.e. key binding. When the user is successfully authenticated, this algorithm extracts the key bits from the appropriate locations and releases the key. The drawback of this scheme is that the key is retrieved from the same locations in the template each time a user is authenticated [12, 15]. Using data derived directly from the biometric image is another method; here the biometric template itself is used as a cryptographic key. However, sensitivity to environmental and physiological factors and the risk of compromising the cryptographic keys stand as big obstacles [13, 39, 104]. Because of this biometric variability, we study the possibility of consistently extracting and generating a relatively small number of bits from the biometric template to serve as a key, or of binding a secret key within the template itself. In this chapter, we study the biometric crypto key structure, its vulnerabilities and its possible implementation scenarios.
5.2.1 Fingerprint Acquirement
The acquirement stage is based on scanning, capturing and feature registration processing to obtain a high resolution image with enough grey levels, in most cases 256 levels (8 bits/pixel). Further image enhancement, or direct enrolment within the validity and quality assessment (see Chapters 2 and 3), could affect the performance evaluation of the system built on it. Therefore, we consider the following steps: image enhancement, thresholding, ridge thinning and minutiae extraction, Figure (5-2), to obtain the fingerprint features; the final result is a set of minutiae points with their characteristic information, i.e. position, gradient and type [77, 105, 106].
The most commonly employed method of minutiae extraction is the crossing number (CN) method. The minutiae are extracted by scanning the local neighbourhood of each ridge pixel in the image using a 3 x 3 window. The CN value is then computed, defined as half the sum of the differences between pairs of adjacent pixels in the eight-neighbourhood. According to the computed CN, the ridge pixel can then be classified as a ridge ending, a bifurcation or a non-minutia point: if the CN value is equal to one, the pixel is classified as a ridge ending point; if it is equal to three, it is classified as a bifurcation; otherwise it is considered a non-minutia point. The crossing number conditions and their corresponding properties are shown in Table (5-1) and Figure (5-3).
Figure 5-2 Block diagram for minutiae based feature extraction
CN = 0.5\sum_{i=1}^{8}\left|P_i - P_{i+1}\right|,\quad P_9 = P_1          (5-1)
where P_i is the pixel value in the neighbourhood of P. The eight neighbouring pixels of the pixel P are scanned in an anti-clockwise direction. This means that a pixel is classified according to the values in the projected 3 x 3 window: if it has only one neighbouring ridge pixel it is classified as a ridge ending, and it is classified as a bifurcation if it has three separated ridge pixels that are connected only through the centre pixel of the projected window.
CN   Property
0    Isolated point
1    Ridge ending point
2    Continuing ridge point
3    Bifurcation point
4    Crossing point

Table 5-1 Crossing number properties

P4  P3  P2
P5  P   P1
P6  P7  P8

Figure 5-3 Pixel numbering of the 3 x 3 window around the candidate pixel P
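As a minimal sketch of the crossing number classification described above, the following function labels ridge endings and bifurcations on a thinned binary skeleton (ridge pixels equal to 1); this is an illustrative implementation (numpy assumed), not the NIST MINDTCT detector used later in the chapter.

import numpy as np

def crossing_number(skel: np.ndarray):
    """Classify ridge pixels of a thinned binary image (ridge = 1) with
    the crossing number of Equation (5-1): CN = 0.5 * sum |P_i - P_{i+1}|."""
    # neighbour offsets P1..P8 in anti-clockwise order, cf. Figure (5-3)
    nbrs = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]
    endings, bifurcations = [], []
    rows, cols = skel.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            if skel[r, c] != 1:
                continue
            p = [skel[r + dr, c + dc] for dr, dc in nbrs]
            cn = 0.5 * sum(abs(p[i] - p[(i + 1) % 8]) for i in range(8))
            if cn == 1:
                endings.append((c, r))        # ridge ending
            elif cn == 3:
                bifurcations.append((c, r))   # bifurcation
    return endings, bifurcations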
extracted minutiae points are stored. Let us call the minutiae \mu_i^j, where i is the descriptor of the fingerprint image (i = 1, ..., 5) and j indexes the n_i minutiae found in the corresponding fingerprint (j = 1, ..., n_i). Each minutia (only ridge endings and ridge bifurcations are considered) has three items (position, gradient and type), Equation (5-2).

\mu_i^j = \left(x_i^j,\, y_i^j,\, \varphi_i^j,\, t_i^j\right)          (5-2)
Another important factor is the position of the core point or reference point of the fingerprint. The core point position should be determined before assembling an appropriate set of minutiae. The centre point of the fingerprint must not be affected by any translation or rotation of the fingerprint image. The fingerprint centre is computed based on the minutiae gravity centre, on ridge counts or on the orientation field. The minutiae gravity centre computation is based on the positions of all the minutiae \mu_i^j, more precisely on their x and y coordinates, using the Euclidean distance between two points:

d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}          (5-3)
The Euclidean distance expression is extended to the whole minutiae set \mu_i^j:

d_i^j = \frac{1}{n_i - 1}\sum_{k \ne j}\sqrt{\left(x_i^j - x_i^k\right)^2 + \left(y_i^j - y_i^k\right)^2}          (5-4)
The minimum of these average distances over all minutiae in each fingerprint can then be computed as:

\delta_i = \min\left(d_i^1, \ldots, d_i^{n_i}\right)          (5-5)

The minutia with the minimal distance \delta_i has the same coordinates as the centre of the fingerprint, [C_X, C_Y]. The centre could vary if the image acquisition device provides images which contain a lot of noise in the image data. These problems can be solved by eliminating the image if its quality falls below the validity and quality threshold. If the image is within the threshold but contains improper minutiae, these can be deleted by the false minutiae removal post-processing step.
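The minutiae gravity centre method of Equations (5-4) and (5-5) can be sketched as follows; this is an illustrative implementation (numpy assumed), not the exact code used in the thesis experiments.

import numpy as np

def gravity_centre(minutiae_xy: np.ndarray):
    """Return the minutia whose average Euclidean distance to all other
    minutiae is minimal, Equations (5-4) and (5-5); used as [C_X, C_Y]."""
    n = len(minutiae_xy)
    # pairwise distances between all minutiae positions
    diff = minutiae_xy[:, None, :] - minutiae_xy[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=2))
    # average distance of each minutia to the others, Equation (5-4)
    avg = dist.sum(axis=1) / (n - 1)
    # minutia with the minimal average distance, Equation (5-5)
    return minutiae_xy[np.argmin(avg)]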
The second method of centre determination is based on ridge counts, Figure (5-4). In cross-section the fingerprint ridges resemble a sine wave, and in the figure they are shown as concentric circles with their origin at the real centre of the fingerprint. The number of through-passes in the horizontal and vertical directions can be computed, where the number of circle through-passes is greater near the centre of all circles than in the outlying region. These through-passes define the ridge count for each column or row. The vertical ridge counts are computed row by row, where Height is the number of pixels in the vertical direction and RC_i is the ridge count for the corresponding row in the image. For the selection of the vertical centre, the value of RC_V needs to be computed:

RC_V = avg\left(\max\left(RC_{V,all}\right)\right)          (5-7)

which represents the coordinate position C_Y and is computed as an average of the region with the maximal value of the ridge count from the whole set RC_{V,all}. Similar equations can be written for the horizontal direction:

RC_H = avg\left(\max\left(RC_{H,all}\right)\right)          (5-9)
Figure 5-4 Fingerprint ridge counts
Another method of computing the centre point is based on the Orientation Field (OF), where the fingerprint is viewed as an oriented texture image [108]. The OF is used to compute the optimal dominant ridge direction in each w × w window or block. The OF has been proposed in several works in the literature [2, 108, 109]. The main steps in determining the orientation image using the algorithm based on the least mean square iteration method are as follows [105]:
Divide the input fingerprint image into blocks of size w × w . For 500 dpi images, the
initial recommended value of w is 16.
Compute the gradients ∂ x (i, j ) and ∂ y (i, j ) at each pixel (i, j ) . Depending on
computational requirements, the gradient operator may vary from the simple Sobel
operator to the more complex Marr-Hildreth operator.
Estimate the local orientation of each block centered at (i, j ) using the following
equations [77, 105]:
v_x(i,j) = \sum_{u=i-\frac{w}{2}}^{i+\frac{w}{2}} \sum_{v=j-\frac{w}{2}}^{j+\frac{w}{2}} 2\,\partial_x(u,v)\,\partial_y(u,v)          (5-10)

v_y(i,j) = \sum_{u=i-\frac{w}{2}}^{i+\frac{w}{2}} \sum_{v=j-\frac{w}{2}}^{j+\frac{w}{2}} \left(\partial_x^{2}(u,v) - \partial_y^{2}(u,v)\right)          (5-11)

\theta(i,j) = \frac{1}{2}\tan^{-1}\!\left(\frac{v_y(i,j)}{v_x(i,j)}\right)          (5-12)
where θ (i, j ) is the least square estimate of the local ridge orientation at
the block centered at pixel (i, j ) . Mathematically, it represents the direction
that is orthogonal to the dominant direction of the Fourier spectrum of
the w × w window.
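A block orientation estimate following Equations (5-10) to (5-12) can be sketched as below; this is an illustrative numpy implementation (Sobel gradients via scipy.ndimage assumed), not the thesis code.

import numpy as np
from scipy import ndimage

def block_orientation(img: np.ndarray, w: int = 16):
    """Least mean square block orientation field, Equations (5-10) to (5-12)."""
    gx = ndimage.sobel(img.astype(float), axis=1)   # horizontal gradient
    gy = ndimage.sobel(img.astype(float), axis=0)   # vertical gradient
    rows, cols = img.shape
    theta = np.zeros((rows // w, cols // w))
    for bi in range(rows // w):
        for bj in range(cols // w):
            rs = slice(bi * w, (bi + 1) * w)
            cs = slice(bj * w, (bj + 1) * w)
            vx = np.sum(2.0 * gx[rs, cs] * gy[rs, cs])          # (5-10)
            vy = np.sum(gx[rs, cs] ** 2 - gy[rs, cs] ** 2)      # (5-11)
            theta[bi, bj] = 0.5 * np.arctan2(vy, vx)            # (5-12); arctan2 handles vx = 0
    return theta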
Due to noise, corrupted ridge and valley structures, unclear minutiae, etc., in the input
image, the estimated local ridge orientation θ(i, j) , may not always be correct. Since local
ridge orientation varies slowly in a local neighbourhood where no singular points appear,
a low-pass filter can be used to modify the incorrect local ridge orientation. In order to
perform the low-pass filtering, the orientation image needs to be converted into a
continuous vector field, which is defined as follows [77]:
\Phi'_x(i,j) = \sum_{u=-\frac{w_\Phi}{2}}^{\frac{w_\Phi}{2}} \sum_{v=-\frac{w_\Phi}{2}}^{\frac{w_\Phi}{2}} h(u,v)\,\Phi_x(i-uw,\, j-vw)          (5-13)

\Phi'_y(i,j) = \sum_{u=-\frac{w_\Phi}{2}}^{\frac{w_\Phi}{2}} \sum_{v=-\frac{w_\Phi}{2}}^{\frac{w_\Phi}{2}} h(u,v)\,\Phi_y(i-uw,\, j-vw)          (5-14)

where h is a two-dimensional low-pass filter and w_\Phi \times w_\Phi specifies its size. The smoothed local orientation is then computed as:

O(i,j) = \frac{1}{2}\tan^{-1}\!\left(\frac{\Phi'_y(i,j)}{\Phi'_x(i,j)}\right)          (5-15)
Compute the consistency level of the orientation field in the local neighborhood of block
(i, j ) by the following formula:
C(i,j) = \frac{1}{n}\sum_{(i',j')\in D}\left|O(i',j') - O(i,j)\right|^{2}          (5-16)

d = \left(O(i',j') - O(i,j) + 360^{\circ}\right)\bmod 360^{\circ}          (5-18)

where n is the number of blocks within the neighbourhood D; O(i',j') and O(i,j) are the local ridge orientations for blocks (i',j') and (i,j), respectively. If C(i,j) is above a certain threshold T_c, then the local orientation in this block is re-estimated.
After the orientation field of an input fingerprint image is determined, Figure (5-5), an
algorithm for the localization of the region of interest is applied, based on the local
certainty level of the orientation field.
Figure 5-5 Original fingerprint image with its result of orientation field computation
The result is the located region of interest within the input image. The level of certainty
of the orientation field in the block (i, j ) is defined as follows:
\varepsilon(i,j) = \frac{1}{w \times w}\cdot\frac{v_x(i,j)^{2} + v_y(i,j)^{2}}{v_c(i,j)^{2}}          (5-19)

v_c(i,j) = \sum_{u=i-\frac{w}{2}}^{i+\frac{w}{2}} \sum_{v=j-\frac{w}{2}}^{j+\frac{w}{2}} \left(\partial_x^{2}(u,v) + \partial_y^{2}(u,v)\right)          (5-20)
Based on the orientation field calculation, the computation of the centre consists of the following steps. For the estimation of the centre of the fingerprint, some reduction of the number of directions needs to be done. Normally, 8 possible directions are used in each w × w block [109]. These directions are shown in Figure (5-6), with direction 8 corresponding to 112.5°. This number of directions is necessary for classification, but for the fingerprint centre computation the number of directions can be reduced. In our case, only 4 directions are sufficient, namely 1, 3, 5 and 7. Directions 1 and 5 remain unchanged, directions 2 and 4 are assigned to direction 3, and directions 6 and 8 are assigned to direction 7. Each direction then has an angle resolution of 45°. The orientation field with only 4 directions is called O_{4R}.
The fingerprint image needs to be divided into four uniform blocks Figure (5-7). These
blocks can be considered as the particular blocks of the coordinate system, with the same
centre in the middle of the fingerprint image. The origin of the image could be defined in
the upper left corner, i.e. in the position [0,0] ; and the end of the image could be in the
lower right corner, i.e. in the position [m, n] . The procedure for gravity centre
computation would be as follows:
C_{x(h)}^{(r_1+r_2)} = \max\left(\sum_{j=c}^{d} O_{4(h)}^{(r)}(i,j)\right),\quad i = a \ldots b          (5-21)

C_{y(h)}^{(r_1+r_2)} = \max\left(\sum_{j=a}^{b} O_{4(h)}^{(r)}(i,j)\right),\quad i = c \ldots d          (5-22)
where C_{x(h)}^{(r_1+r_2)} is the x position and C_{y(h)}^{(r_1+r_2)} is the y position of the orientation field direction h in the image blocks r_1 and r_2. The term O_{4(h)}^{(r)}(i,j) denotes the value of the orientation field at the point (i,j). To determine the centre of the fingerprint, it is
necessary to compute the centres of gravity for the dedicated orientation field directions.
The gravity centre points can be connected and expressed as continuous lines. The
intersections of two lines with perpendicular directions are computed. These intersections
create a short abscissa and the middle point of this abscissa denotes the centre point of the
fingerprint. The final result of the acquirement stage is a vector of extracted minutiae points and the reference centre point. This vector will be used as input data for the crypto key generation stage, to construct a direct minutiae point key.
key in a simple way. A reconstruction of the minutiae points will be used to make these points more secure, generating a combined encapsulated cryptographic key based on reforming a graph and adjacency matrix of the extracted minutiae data.
The process of generating the key comprises all necessary functions to achieve non-repudiation, encryption, digital signing and strong authentication in an open network [11]. The major novelty of the proposed CBCG consists of keeping the minutiae points away from several attacks as a first security level of the crypto-key generation life cycle. The extracted minutiae set μ, Equation (5-2), is grouped into traced neighbouring contours, with the neighbouring relation defined by the upper and lower levels of contours. Using a graph relation based on vertices and edges, i.e. the minutiae set and the ridge connection lines respectively, CBCG formulates a minutiae graph under specific conditions: a connection line may only exist between two neighbouring contours of the extracted fingerprint information, and all vertices within a contour must be visited. The CBCG graph constructed using this relation of vertices and edges is shown in Figure (5-9). Key generation will be studied in two scenarios: in the first, the detected singular point (SP) is included; in the second, the SP is excluded.
Figure 5-9 Minutiae graph with upper, main and lower contours in the (X, Y) coordinate plane
In the experimental graphical phase, the minutiae points were extracted from a cropped image of size 200x200 pixels. The extracted minutiae \mu(N \times M) are grouped, Figure (5-10), according to their coordinates (x, y) within contours.

CQ = \frac{\mu A}{CW}          (5-23)

where CQ is the contour quantity, \mu A is the minutiae points area, and CW is the contour width.
In the mathematical and key generation phase, an adjacency matrix is formed from the traced and grouped points (vertices) according to their connection order (edges) using the following rule: a_{ij} = 1 if there exists a path from v_i → v_j, and a_{ij} = 0 otherwise. This is illustrated in Figure (5-11).
0 1 1 1 1 1 0 0
1 0 1 1 1 0 0 0
1 1 0 0 1 0 0 0
1 1 0 0 1 1 1 0
1 1 1 1 0 1 1 0
0 0 0 1 1 0 1 0
0 0 0 1 1 1 0 0
0 0 0 0 0 0 0 0
Figure 5-11 Adjacency Matrix for the given graph in Figure (5-9)
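A minimal sketch of how such an adjacency matrix can be produced from a contour graph edge list is given below (illustrative only; the edge list is read off the example graph of Figure (5-9), and numpy is assumed).

import numpy as np

# Edges of the example contour graph (vertices 1..8, vertex 8 isolated)
edges = [(1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (2, 3), (2, 4), (2, 5),
         (3, 5), (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7)]

def adjacency_matrix(edges, n_vertices=8):
    """Build the symmetric adjacency matrix: a_ij = 1 if an edge joins v_i and v_j."""
    a = np.zeros((n_vertices, n_vertices), dtype=int)
    for i, j in edges:
        a[i - 1, j - 1] = a[j - 1, i - 1] = 1
    return a

print(adjacency_matrix(edges))   # reproduces the matrix of Figure (5-11)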
The output matrix is taken as input for the crypto generator and processed by defined mathematical operations. These operations generate vectors and sub-vectors which feed the cryptographic module algorithms; existing modules such as the symmetric ciphers DES and 3DES are considered [8]. Another scenario is to partition the matrix into sub-matrices, which can be used as secure encapsulated headers as shown in Figure (5-12); without de-capsulating the previous headers, the cipher text cannot be decrypted into plain text. The suggested encapsulation technique works as associated headers that change the plain text format according to the encryption style; forwarding can be thought of as one or more messages (locked text) inside a locking header, as illustrated in Figure (5-12). By applying an entire summation of the previously generated matrices and finding a vector of prime numbers, the largest primes can be used for a crypto-module algorithm such as RSA [8, 40]. Applying the RSA rules of encryption and digital signature generation within its privileges offers maximum security due to the huge key size involved.
Figure 5-12 Encryption encapsulation technique, where MR is matrices regenerator,
VHR is vector header generator
Figure 5-13 Adjacency matrices dimension
Because of the different sizes of the generated matrices, the key will be more resistant to brute force attack. The uniqueness of the key is determined by the uniqueness of the fingerprint minutiae used in the key. Applying a string matching algorithm to the generated matrices, 100% uniqueness is found in both cases used as input to the crypto module phase. The protocols of FVC2002 are used to evaluate the False Accept Rate (FAR) and Genuine Accept Rate (GAR) over all phases. FAR is the ratio of the number of false acceptances to the number of identification attempts, while GAR is the corresponding true positive ratio. Using these parameters, we have plotted the receiver operating characteristic (ROC) curves of both cases, implemented with core point detection as well as without core detection (see Figure (5-14)).
Figure 5-14 ROC curves estimated for both cases
The curves in Figure (5-14) show a 100% ratio for both scenarios (with and without the singular point). At some threshold points, the first case (area surrounding the core point) shows improved performance compared to the other case (without singular point detection). The results show that key generation depends completely on the quality assurance of the images and a near-perfect minutiae extractor; empirically, the minutiae detector (MINDTCT) released with NIST Fingerprint Image Software 2 [37] is suggested. MINDTCT is standard software that automatically locates and records ridge endings and bifurcations in fingerprint images, and it includes minutiae quality assessment based on local image conditions
image [110]. The RP is identified by its symmetry properties and is extracted from the complex orientation field estimated from the global structure of the fingerprint, i.e. the overall pattern of the ridges and valleys. Complex filters, applied to the orientation field at multiple resolution scales, are used to detect the symmetry and the type of symmetry. The RP detection algorithm is mathematically represented below:

z = \left(f_x + i f_y\right)^{2}          (5-24)

where f_x and f_y are the derivatives of the image in the x and y directions, respectively. The orientation tensor is implemented by convolving the grey value image with separable Gaussians and their derivatives; the calculation of the tensor itself already implies complex filtering [111].
c(x,y) = (x + iy)^{m}\cdot\exp\left\{-\frac{x^{2} + y^{2}}{2\sigma^{2}}\right\}          (5-25)

z'(x,y) = \frac{z(x,y)}{\left|z(x,y)\right|}          (5-26)

where z' is the normalized complex orientation tensor and represents the angle of this field.
4. Only one convolution operation is performed between the filter and the angles of the complex orientation tensor field, as follows:

z''(x,y) = \sum_{u=-\frac{w}{2}}^{\frac{w}{2}} \sum_{v=-\frac{w}{2}}^{\frac{w}{2}} C(u,v)\cdot z'(x - wv,\, y - wu)
where z ' ' is the magnitude of the filter response that is applied to the
complex orientation tensor field, C is the mask of the complex filter and w
is the width of that mask.
The aim is to trace all pixels to find the maximum filter response and to assign its (x, y) coordinates to the core point, i.e. the reference point. Fingerprint feature extraction relies on the detected RP as the centre from which distances to the extracted points are measured, applying the Crossing Number (CN) concept. CN extracts the ridge points from the skeleton image by examining the local neighbourhood of each ridge pixel using a 3 x 3 window, Equation (5-1). The extracted minutiae \mu(N \times M) points are:

\mu = \left\{\mu_i \mid \mu_i = (x_i, y_i, t_i, d_i),\ i = 1 \ldots n_\mu\right\}          (5-27)

where x_i is the x-coordinate, y_i is the y-coordinate, t_i is the type and d_i the distance of a particular minutia; Equation (5-27) differs from (5-2) by using the point distance. The distance is defined as the Euclidean distance between the extracted minutia and the reference point:

D = \sqrt{(x_r - x_m)^2 + (y_r - y_m)^2}          (5-28)

where (x_r, y_r) are the reference point coordinates and (x_m, y_m) are the minutia point coordinates. Table (5-2) shows some template information.
Figure 5-15 Basic block diagram
x y t d
9 124 6 85.094
14 182 2 96.519
24 115 2 71.197
24 182 2 88.408
28 144 6 67.912
30 152 2 68.352
34 24 6 120.07
34 143 6 61.847
36 182 2 79.246
39 154 2 60.836
The following procedure builds slicing windows based on the principle of choosing the first window as the first region of interest surrounding the RP; empirically it was chosen to be a 64 x 64 window, and the subsequent windows double its size (128, 256, 512), starting from the window centred on the RP until the template area is covered.
According to the template area size, there will be at least 4 slicing windows; each vector is the slicing window size multiplied by the quantity of minutiae points it contains. The generated vectors are used to produce the header locker key and the encryption provider key. The header locker key (HLK) is produced by concatenating V1 and V3, while the encryption provider key (EPK) is produced by concatenating V2 and V4, Figure (5-17).
Figure 5-17 Generated keys, where HLK is the Header Locker Key and EPK is the Encryption Provider Key
The stability of the generated keys depends on verified, distinctive fingerprint features extracted from aligned, qualified fingerprint images. The HLK is used as the encrypted text closing key: without presenting this key, the system cannot proceed with the encryption procedure provided by the EPK. The EPK is a source key that is used with either the DES or the RSA encryption algorithm.
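A minimal sketch of the slicing window vector and key derivation described above is given below; it is illustrative only. The window sizes, the multiplication of window size by minutiae count and the V1||V3 / V2||V4 concatenations follow the text, while the hashing of the concatenated material into fixed-length keys is an added assumption.

import hashlib
import numpy as np

def slicing_window_keys(minutiae_xy: np.ndarray, rp: tuple):
    """Derive HLK and EPK from minutiae counts in windows of side 64, 128, 256, 512
    centred on the reference point RP, as described for Figure (5-17)."""
    vectors = []
    for size in (64, 128, 256, 512):
        half = size // 2
        inside = np.sum((np.abs(minutiae_xy[:, 0] - rp[0]) <= half) &
                        (np.abs(minutiae_xy[:, 1] - rp[1]) <= half))
        vectors.append(size * int(inside))        # vector = window size x minutiae count
    v1, v2, v3, v4 = vectors
    hlk_material = f"{v1}|{v3}".encode()          # HLK from V1 || V3
    epk_material = f"{v2}|{v4}".encode()          # EPK from V2 || V4
    # hashing to fixed-length keys is an assumption, not stated in the thesis text
    return hashlib.sha256(hlk_material).digest(), hashlib.sha256(epk_material).digest()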
Table 5-3 Average of sub and whole key sizes
The entropy of a system fed by the HLK and EPK depends on the secure system construction, which in the slicing window analysis (SWA) case has two security criteria: cipher header closing, as part of the file encryption certificate, and plain text encoding, instead of simply developing longer cryptographic keys to resist brute force attacks. The SWA parts serve as an infrastructure key for merging cryptography and biometrics. Tests were done on chosen fingerprint images of perfect quality, which are impossible to find in practice, because a fingerprint is never identical from scan to scan; measurement errors are inescapable when the fingerprint is scanned. A fingerprint can never be used as the seed of a private key unless it can be converted to one and the same identification in real time.
KEY s1 s2 s3 s4 s5 s6 s7 s8 s9 s10
s1 1 0 0 0 0 0 0 0 0 0
s2 0 1 0 0 0 0 0 0 0 0
s3 0 0 1 0 0 0 0 0 0 0
s4 0 0 0 1 0 0 0 0 0 0
s5 0 0 0 0 1 0 0 0 0 0
s6 0 0 0 0 0 1 0 0 0 0
s7 0 0 0 0 0 0 1 0 0 0
s8 0 0 0 0 0 0 0 1 0 0
s9 0 0 0 0 0 0 0 0 1 0
s10 0 0 0 0 0 0 0 0 0 1
Table 5-4 Uniqueness of generated keys where logical 1 (true) value indicates full
matching and logical 0 (false) otherwise.
5.5 Summary
Approaches for generating biometric cryptographic keys that merge cryptography and biometrics have been presented. They take advantage of the information extracted from the fingerprint template and of standard encryption algorithms to provide a novel way of generating cipher keys without having to remember complicated sequences which might be lost, stolen, or even guessed. In addition, these approaches provide encouraging prospects as a platform for stable fingerprint extracted features; alternatively, they could be used as the seed of a public key infrastructure (PKI) in which the private key is generated on a carry-on device, e.g. a smart card, whenever the legitimate user provides the seed of the private key to his device in order to sign a message. To overcome the key repeatability problems, a combination of fuzzy commitment properties and the generation technique will be useful. To reduce the dependence on fingerprint quality and alignment, fuzzy extraction and/or vault generation schemes will be useful too. In that case additional work will be performed to see whether fingerprint parameters or classifiers may serve as more stable and unique fingerprint biometric features.
Chapter 6 Fuzzy Vault Cryptography Key
Structure
6.1 Introduction
Crypto-biometric systems [5, 15] have recently emerged as an effective process for key management, addressing the security weakness of conventional key generation, release and binding systems based on traditional passwords, tokens or pattern recognition based biometric systems. The intention is to bind a cryptographic key with the user's biometric information in a manner that meets the following requirements [45, 61] of distortion tolerance, discrimination and security (see Chapter 2.3):
Discrimination is the ability of the system to distinguish all users of the system and to output different keys for different users.
Security of the system means that neither the key nor the user's original biometric information can be extracted or calculated when the stored information is compromised.
Chapter 5 showed techniques of key generation from biometric data in which the key is extracted directly from the biometric information. Crypto key generation algorithms have very high proven security, but they suffer from the key management problem. There are two main problems with those methods. First, as a result of changes in the biometric image due to environmental and physiological factors, the biometric template is generally not consistent enough to use as a cryptographic key. Secondly, if the cryptographic key is ever compromised, then the use of that particular biometric is irrevocably lost. In a system where periodic updating of the cryptographic key is required, this is catastrophic. One of the main challenges in direct key generation approaches is to maintain the entropy of the key and to keep the biometric information secure at the same time. The principal obstacle for a direct crypto biometric key is the inherent variability of the user's biometric, and to overcome this, crypto key generation moved from direct generation to fuzzy binding approaches. The fuzziness principle of the constructed vault is derived from the fuzziness of the fingerprint information. The development of the fuzzy
commitment scheme was first proposed in [61] to integrate well-known error-control
coding and cryptographic techniques to construct a novel type of cryptographic system.
Instead of an exact, unique decryption key, a reasonably close witness can be accepted to decrypt the commitment. This characteristic makes it possible to protect the biometric data using traditional cryptographic techniques. However, since the fuzzy commitment used in this scheme does not have the property of order invariance, any missing or added elements will result in matching failure. To overcome this problem, [45] proposed
a new architecture, which possesses the advantage of order-invariance. At the same time,
the author suggested that one of the important applications of the fuzzy commitment is
secure biometric systems. Following this direction, [63] employed the fuzzy vault scheme
on a secure smartcard system, where the fingerprint authentication is used to protect the
private key. In the biometric cryptosystem, the secret information is hidden as
coefficients in a polynomial, which acts as the frame of the fuzzy commitment. The fuzzy
vault construct is a biometric cryptosystem that secures both the secret key and the
biometric template by binding them within a cryptographic framework. The fingerprint
vault construction is based on the assumption that the fingerprint features are extracted
and well aligned in a black box. The work in this chapter addresses the management and analysis problems of the fuzzy vault crypto structure; a new capsulation approach based on the fingerprint fuzzy vault (FFV) is proposed. The FFV is examined through its anatomy and attacks, and a performance evaluation of the FFV is demonstrated throughout this chapter. This was the motivation to investigate the tolerance necessary for the FFV to function, to see the effect of different vault and tolerance parameters, and to determine the consequences of varying several of them.
6.2 Fuzzy Vault Anatomy
The fuzzy vault scheme (FVS) [45] was developed and built upon the ideas of the fuzzy commitment scheme [61]. The FVS consists of two parts, encryption and decryption, Figures (6-1, 6-2).
data points (= X ) in the input template to determine f ( X )(= Y ) . These ( X , Y ) pairs,
known as true points, constitute the locking set of what is to become the fuzzy vault. To
hide the identity of the true points, many false points (chaff) are then added to the set of
true points. This completes the fuzzy vault, which is then stored. The security of the
fuzzy vault scheme is based upon the difficulty of the polynomial reconstruction problem,
or as described later, the problem of decoding Reed-Solomon codes. For an overview of
research related to cryptography based on polynomial reconstruction, see [114, 115]. To
unlock the vault and recover the message, the data points ( X ') from the “live” template
(the unlocking set) are used for decryption. If a substantial number (i.e. within the
symbol-correcting capability of the system) of these data points overlap (after error-
correction) the true points in the stored vault, then the message can be successfully
recovered. The main advantage to this system is that the order of the data points does not
matter. Also, it can be shown to be secure, if there are sufficient chaff points in the vault
relative to the number of true points.
A Galois field is a finite field with q = p^n elements, where p is a prime integer and n ≥ 1. By definition, arithmetic operations (addition, subtraction, multiplication, division, etc.) on elements of a finite field always give a result within the field. An element of order (q − 1) in GF(q) is called a primitive element of GF(q). All non-zero elements of GF(q) can be represented as (q − 1) consecutive powers of a primitive element α. All elements of GF(2^m) are generated from the elements {0, 1, α}. Taking the field GF(2^3) and the generator polynomial x^3 + x + 1 = 0, the elements of the field can be calculated, starting with an element called α, the primitive root (in this case α = 2 = x). All elements of the field (except 0) are described uniquely by a power of α. For any finite field GF(2^n), α^{2^n − 1} = α^0 = 1. In this case, the field is constructed as follows [116]:
Power                               Polynomial       Binary   Decimal
α^0 = x^0                           1                001      1
α^1 = x                             x                010      2
α^2 = x·x                           x^2              100      4
α^3 = x^3                           x + 1            011      3
α^4 = α·α^3 = x·(x + 1)             x^2 + x          110      6
α^5 = α·α^4 = x·(x^2 + x)           x^2 + x + 1      111      7
α^6 = α^2·α^4 = x^2·(x^2 + x)       x^2 + 1          101      5
α^7 = α·α^6 = x·(x^2 + 1)           1                001      1 = α^0
α^8 = α·α^7 = α·1 = α, and the cycle repeats.
Galois fields are used in a variety of applications such as linear block codes, classical
coding theory and in cryptography algorithms.
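The table above can be reproduced by repeatedly multiplying by x and reducing modulo the generator polynomial x^3 + x + 1; a minimal sketch (field elements represented as 3-bit integers, plain Python, illustrative only) follows.

def gf8_powers(generator=0b1011, width=3):
    """List consecutive powers of alpha = x in GF(2^3) with x^3 + x + 1 = 0."""
    elems, a = [], 0b001                  # alpha^0 = 1
    for _ in range(2 ** width - 1):       # the non-zero elements cycle with period 7
        elems.append(a)
        a <<= 1                           # multiply by x
        if a & (1 << width):              # degree reached 3: reduce using x^3 = x + 1
            a ^= generator
    return elems

print([f"{e:03b}" for e in gf8_powers()])
# ['001', '010', '100', '011', '110', '111', '101'] -> alpha^0 .. alpha^6, then it repeats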
with specific roots in GF(2^m). Inherited from the generator polynomial, these roots are common to every codeword.
As shown in Figure (6-3), the difference (n − k), called 2t, is the number of parity symbols appended to make the encoded block, with t being the error correcting capability (in symbols). A Reed-Solomon codeword is generated using a special polynomial; all valid codewords are exactly divisible by the generator polynomial, which has the general form:

g(x) = \left(x - \alpha^{i}\right)\left(x - \alpha^{i+1}\right)\cdots\left(x - \alpha^{i+2t}\right)          (6-1)

c(x) = g(x)\cdot i(x)          (6-2)

where g(x) is the generator polynomial, i(x) is the information block, c(x) is a valid codeword and \alpha is referred to as a primitive element of the field.
Example: the generator for RS(255, 249), showing the general form and the expanded polynomial form:

g(x) = \left(x - \alpha^{0}\right)\left(x - \alpha^{1}\right)\left(x - \alpha^{2}\right)\left(x - \alpha^{3}\right)\left(x - \alpha^{4}\right)\left(x - \alpha^{5}\right)
g(x) = x^{6} + g_{5}x^{5} + g_{4}x^{4} + g_{3}x^{3} + g_{2}x^{2} + g_{1}x + g_{0}

From the example, it can be seen that the original terms are expanded and simplified. The g coefficients (g_5, g_4, g_3, g_2, g_1, g_0) are constants made up of additions and multiplications of \alpha^{0}, \alpha^{1}, \alpha^{2}, \alpha^{3}, \alpha^{4} and \alpha^{5}, and can be computed using Galois field
computations. Reed-Solomon codes are cyclic codes but are non-binary, with symbols
made up of m-bit (m > 2 ) sequences. RS codes achieve the largest possible code
minimum distance for any linear code with the same encoder input and output block
lengths. The distance between two code words for non binary codes is defined as the
number of symbols in which the sequences differ. Given a symbol size s, the maximum
codeword length (n) for an RS code is: n = 2 s − 1. Given 2t parity symbols, an RS code
can correct up to 2t symbol errors in known positions (erasures) or detect and correct up
to t symbol errors in unknown positions.
6.3.3 Welch-Berlekamp Algorithm
The Welch-Berlekamp algorithm is one of the algebraic methods for decoding Reed-Solomon codes. It can be thought of as a kind of curve fitting process: a curve can be constructed to fit any k points. When two more points are added, the curve must fit at least k + 1 of them, but it is allowed to miss one of the points. After adding another two points, the curve must fit at least k + 2 of them. When eventually all n points have been considered, the curve must fit at least (n + k)/2 of them. For more explanation, suppose that Alice sends Bob a message over a noisy channel. When Bob receives the message, some of the transmitted packets have been corrupted, but it is not known which packets are corrupt and which are not. Using RS encoding, Alice must transmit (k + 2t) characters to enable Bob to recover from t general errors. Therefore, the message is encoded as a polynomial P(x) of degree (k − 1) such that c_j = P(j).
Given a set of points {(z_i, y_i)}_{i=1}^{n} over a finite field, and parameters [n, k, w], recover all polynomials p of degree less than k such that p(z_i) ≠ y_i for at most w distinct indexes i. The error locator polynomial is

E(x) = (x - e_1)(x - e_2)\cdots(x - e_t)          (6-3)

At exactly the t points at which errors occurred, E(x) = 0. For all (k + 2t) points where 1 ≤ x ≤ (k + 2t), P(x)·E(x) = R(x)·E(x). At points x at which no error occurred, this is true because P(x) = R(x); at points x at which an error occurred, it is true because E(x) = 0. Let Q(x) = P(x)E(x). Specified by (k + t) coefficients, Q(x) is a polynomial of degree (k + t − 1). Described by (t + 1) coefficients, E(x) is a polynomial of degree t. There are
only t unknowns because the coefficient of x t is 1. There are also (k + 2t) linear
equations in Q (x ) = R ( x )E ( x ) for 1 ≤ x ≤ (k + 2t). For these equations, the unknowns are
the coefficients of the polynomials Q(x) and E(x). The known values are the received
values for R(x). The BW algorithm is illustrated by the following example (non-finite
fields are used to simplify the calculations):
The information packets to be sent are "1", "3" and "7" (therefore, k = 3). By interpolation, we find the polynomial:

P(x) = x^{2} + x + 1          (6-4)

P(0) = 0^{2} + 0 + 1 = 1,
P(1) = 1^{2} + 1 + 1 = 3,
P(2) = 2^{2} + 2 + 1 = 7.
To be able to correct for one error (i.e., t = 1), (k + 2t), or 5, packets are transmitted (2
redundant):
Now, assume P (1) is corrupted and 0 is received, instead of 3, in that packet. When
correcting for a single error, the error-locator polynomial is: E(X) = X – e, where e is not
yet known. R(X) is the polynomial whose values at 0,K,4 are those received over the
channel (1, 0, 7, 13, 21).
As previously described:
P (x )E ( x ) = R( x )E ( x ) 6-5
for X = 0,1,K,4 . Although P and E are not known (although it is known that P is a
second-degree polynomial), the above relationship can be used to obtain a linear system
of equations whose solution will be the coefficients of P and E.
Let

Q(X) = P(X)E(X) = aX^{3} + bX^{2} + cX + d          (6-6)

where a, b, c and d represent the unknown coefficients to be determined. Also,

aX^{3} + bX^{2} + cX + d = R(X)E(X) = R(X)(X - e)          (6-7)
which can be rewritten as:
aX^{3} + bX^{2} + cX + d + R(X)e = R(X)\cdot X          (6-8)

Five linear equations are generated by substituting X = 0, 1, ..., 4 into the above formula; for X = 4, for instance:

a(4)^{3} + b(4)^{2} + c(4) + d + (21)e = 21(4);   64a + 16b + 4c + d + 21e = 84.
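The five equations of this worked example can be solved directly; the sketch below (numpy assumed, ordinary arithmetic rather than finite field arithmetic, exactly as in the example) recovers the error location e = 1 and the original polynomial P(x) = x^2 + x + 1.

import numpy as np

# received values R(x) at x = 0..4; the value at x = 1 was corrupted (3 -> 0)
xs = np.arange(5)
r = np.array([1.0, 0.0, 7.0, 13.0, 21.0])

# unknowns (a, b, c, d, e) of  a x^3 + b x^2 + c x + d + R(x) e = R(x) x,  Equation (6-8)
A = np.column_stack([xs**3, xs**2, xs, np.ones(5), r])
rhs = r * xs
a, b, c, d, e = np.linalg.solve(A, rhs)

# Q(x) = a x^3 + b x^2 + c x + d and E(x) = x - e; then P(x) = Q(x) / E(x)
p, remainder = np.polydiv([a, b, c, d], [1.0, -e])
print(np.round(e))            # 1.0 -> the corrupted position
print(np.round(p))            # [1. 1. 1.] -> P(x) = x^2 + x + 1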
The encryption portion of the system is the creation of the fuzzy vault for the message. A
template created from multiple images of the same fingerprint is used as a cryptographic
key to encode a message defined by the coefficients of a polynomial. Data points that
represent the polynomial are stored in the fuzzy vault. Many random data points (chaff)
are added to the vault to hide the identity of the true polynomial data points. MINDTCT
is used to create the fingerprint template Figure (6-5).
Figure 6-4 Fingerprint vault encryption implementation model
The extracted information contains the minutiae coordinates (x, y) and the orientation angle (θ). To obtain repeatable data points, only those data points found to occur (within a predefined threshold) in more than half of the impressions were used to create the input fingerprint template. The X-value (codeword) for the true data points is calculated by concatenating either (x, y), (x, θ) or (y, θ); the decryption process concatenates the same data variables. Since it is desirable that all values be constrained to a finite size, all symbols are defined to be within a finite field and all calculations are performed using finite field operations. In practice, data communications (especially with error-
correction) often use finite fields referred to as Galois fields. In particular, GF(2^n) fields are used, where the 2 indicates that the field is described over binary numbers and n is the degree of the generating polynomial [116].
The symbols of the message are encoded as the coefficients of a k-degree polynomial. For example, the string "Mokhled", or ASCII (77, 111, 107, 104, 108, 101, 100), could be encoded in this way. To hide the identity of the true points, many false points (chaff) are added to the vault, Figure (6-6). The false points are added far enough away from the true points so that they do not attract values lying within the fuzziness (threshold distance) of the true points. Also, they are placed outside the threshold distance of other chaff points, since they would otherwise be redundant. As a final step in the vault creation, all points in the vault are sorted, resulting in a mixture of true and false points from which the true points must be discovered when decrypting the message.
Figure 6-6 (a) True points distribution; (b) chaff points distribution
Chaff point generation depends on the number of chaff points and a predefined threshold. The threshold is the radius around a vault point within which a live point would match. This value is given in integer normalized (x, y) coordinate units; therefore, a value of one unit corresponds to a Euclidean distance of 1.414, as listed in Table (6-1).
Units Distance
1 1.414
2 2.828
3 4.243
4 5.657
5 7.071
6 8.485
7 9.899
8 11.314
9 12.728
10 14.142
11 15.556
12 16.971
Table 6-1 Unit to Euclidean distance equivalence
Codeword Selection
To reconstruct the message polynomial, the user must identify true codewords from the
vault, since the corresponding (X, Y) pairs define the polynomial. The X′ data is used to
select the true codewords from the vault. Since biometric data are expected to be inexact
(due to acquisition characteristics, sensor noise, etc.), X′ template values are matched to X
vault values within a predefined threshold distance, thus allowing for exact symbol
matching. This is the “fuzziness” built into the system, since multiple X′ values (i.e.,
those within the threshold distance of X values) will result in a single X value.
Figure 6-7 Fingerprint Vault Decryption implementation model (dashed box)
The message polynomial is then reconstructed using the (X, Y) pairs identified by the live template. A valid live template may contain more, fewer or different minutiae than those extracted when the original template was created. However, if there is significant overlap of the X and X′ codewords, the message can still be recovered by using a typical telecommunications error-correcting scheme for the recovery of data over a noisy channel, such as a Reed-Solomon (RS) code. As reviewed earlier, RS(k, t) codes are those in which codewords consist of t symbols and each codeword corresponds to a unique polynomial p of degree less than k over the finite field F of cardinality q. Decoding therefore determines whether there exists a polynomial p(x) of degree at most d such that Y_i = p(X_i) for all but k
values of (X_i, Y_i). Using the BW algorithm, if 2k + d < m, this condition can be verified by finding the solution of a linear constraint system:

N(X_i) = Y_i \cdot W(X_i),\quad i = 1, 2, \ldots, m          (6-9)

where the polynomial degree of W is at most k; p(x) = N/W is the resulting polynomial after the 2k + d + 1 unknowns are calculated.
Message Recovery
The distribution of the number of successful message recoveries from the 4368 simulations [(8 numbers of true points x 6 numbers of chaff points x 7 thresholds x 13 varying distances between live points), each done 3 times] is given in Table (6-3).

Successes (out of 3)        zero    one    two    three
Number of simulations       2889    194    151    1134

Table 6-3 Successful message recovery
The effects on successful message recovery of varying the parameters [number of true points, number of chaff points, threshold, and live check points] were examined through a series of box plots. Box plots were chosen because they provide an excellent visual summary of each parameter's distribution. In these plots, the box has lines at the lower quartile (25%), median (50%) and upper quartile (75%) values. The whiskers are lines extending from each end of the box to show the extent of the rest of the data, and outliers are data with values beyond the ends of the whiskers. If there are no data outside the whisker, a dot is placed at the bottom whisker. In each plot, the parameter on the Y-axis is plotted against the number of successfully recovered messages.
Figure 6-8 Effect of points parameter (a) true points, (b) chaff points
The number of true points, Figure (6-8(a)), is between 30 and 55. Within the range simulated, this parameter has little significant effect. This result is expected because the number of true points is small in relation to the total number of vault points, i.e. the true points represent a small proportion of the total vault points. The low median value is 40 for 0, 1 and 2 successes. Within the parameter range, there is a small effect due to the number of chaff points. As the number of chaff points increases, it becomes somewhat more difficult to recover the message: as shown in Figure (6-8(b)), the median increases to 300 chaff points when the message is never recovered (success = 0), while the median is 200 when the message is recovered at least once.
Figure (6-9) shows that, as the value of the threshold parameter increases, the success rate increases. The median value for no message recovery is 2 and the median value for all messages recovered is 4. The upper quartile for successes [1, 2 and 3] is identical. This parameter is clearly positively correlated with success, since the greater the threshold parameter, the more tolerance there is for matching true points.
Capsulation pre-processing
It is well known that for encryption, keys at both the sender and receiver sides must match exactly. However, repeated capture of biometric data from the same subject usually does not result in identical data in each capture, only in similar extracted features. This is due to several factors, including sensing errors, alignment errors, presentation angles, finger deformation, skin oils, dirt, etc. Because of this inexact reproducibility, a method is needed to "correct" the data before it is presented to the processing subsystem in order to obtain reproducible results. This can be accomplished by applying error-correcting codes [48], as is common practice when recovering messages transmitted over a noisy channel. Yoichi et al. [119] proposed a statistical A/D conversion as an effective scheme to convert biometric data to just one identification number; their solution could be used as another scenario to overcome the template reproducibility problems. Either [48] or [119] is useful for preparing minutiae based templates as the first stage of the capsulation approach, Figure (6-10).
TM = \left((x \,\|\, y),\ \theta\right)          (6-10)

A chaff point algorithm, Figure (6-11), is used to generate random chaff points (CP) which are added to the TM, based on a union relation, to generate the total points set (TP):

TP = TM \cup CP          (6-11)
Chaff Point Algorithm
Input: TM (true minutiae points), number of chaff points, threshold distance
Output: TP (total points) – array containing all true and chaff points
Repeat until the requested number of chaff points has been generated:
    Choose a point uniformly at random from the coordinate domain (0..2^16)
    If the point is not within the threshold distance of any true point
       and not within the threshold distance of any already selected chaff point:
        Select it and add it to CP
TP = TM ∪ CP
Return TP
End
Figure 6-11 Chaff point generation algorithm
The TP serves as the first capsulation shield. Next, a construct vault V(TP) is computed from the TP and the injected secret key (SK), Figure (6-12). V(TP) can be used as a second capsulation shield and is stored in the clear in the header of the encrypted file, for vault reconstruction at the decryption stage.
Vault Construction Algorithm
Input: TP (total points), SK (secret key)
Algorithm:
    V(TP) ← (SK encoded as the coefficients of a polynomial, Galois field array built from TP)
    % Galois field arrays are created in the field GF(2^m), for 1 <= m <= 16
Return V(TP)
End
Figure 6-12 Vault construction algorithm
Finally, part of the encrypted vault serves as the file header for encryption, or can be stored in the header of the encrypted file, which forms the final encapsulation shield.
successfully recovered, and the partial vault (header file key) is released from the final shield of the encrypted file. In the decapsulation portion of the security capsulation algorithm, a FIFO (first in, first out) rule must be applied to the whole decryption process. Following these principles, the constructed vault is unshielded and decapsulated to the key release stage, and finally the bound secret key and the header key are released, Figure (6-13).
It can be concluded that the classes of attack against the BKC approach include brute force attacks against all BKC shields (i.e. EK, V and TM), and chaff point identification to find the original points [120]. For example, the vault could be attacked by a brute-force method bf(r, t, k), where r is the total number of points, t is the number of real points and k is the degree of the polynomial. For an attacker, r and t are of the same size as the vault parameters; for a valid user, however, r is the size of their unlocking set and t is the number of non-chaff points in that set. The complexity of the brute force attack (C_bf) is:

C_{bf} = \binom{r}{\sigma}\binom{t}{\sigma}^{-1}          (6-12)
Figure 6-14 The relationship between chaff points, minimum distance and release-ability of the locked key
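Under the common reading of Equation (6-12), in which σ is the number of vault points needed to interpolate the degree-k polynomial (σ = k + 1), the attack complexity can be evaluated as in the following sketch; the σ = k + 1 choice and the example parameters are illustrative assumptions, not values stated in the text.

from math import comb

def brute_force_complexity(r: int, t: int, k: int) -> float:
    """C_bf = C(r, sigma) / C(t, sigma) with sigma = k + 1, Equation (6-12):
    expected effort before an attacker draws a (k+1)-point subset of genuine points."""
    sigma = k + 1
    return comb(r, sigma) / comb(t, sigma)

# illustrative parameters: 21 true points, 300 chaff points (r = 321), polynomial degree 8
print(f"{brute_force_complexity(321, 21, 8):.3e}")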
A basic anatomy of vault unlocking can be viewed in two contexts. The first is the complexity for a valid user of unlocking a vault with a matching fingerprint image; one goal is to minimize this complexity. The second context is the complexity for an attacker without fingerprint information of trying to crack the vault; researchers in this field wish to maximize this complexity while the attacker wishes to minimize it. Figure (6-15) shows that a higher level of security is related to a higher degree of the polynomial as well as to a larger number of chaff points. It is clear that a higher complexity can be achieved with maximum values of the vault parameters.
Figure 6-15 The relationship between chaff points, polynomial degree and vault complexity
Figure 6-16 Fingerprint Vector Features scheme
6.9.1 Preprocessing
The preprocessing stage contains three main steps: centre point determination, cropping, and sectorization and normalization of the region around the reference point.
Determine the x and y magnitudes of the gradient at each pixel in each block, Gx
and Gy.
Within each block, compute the slope perpendicular to the local orientation field using Equation (6-13).
\Theta = \frac{1}{2}\tan^{-1}\!\left(\frac{\sum_{i=1}^{16}\sum_{j=1}^{16} 2\,G_x(i,j)\,G_y(i,j)}{\sum_{i=1}^{16}\sum_{j=1}^{16}\left(G_x^{2}(i,j) - G_y^{2}(i,j)\right)}\right) + \frac{\pi}{2}          (6-13)
Looking only at blocks with slopes in the range 0 to π/2, trace a path downwards until a slope outside this range is encountered, and mark that block. The block with the highest number of marks is used to compute the slope in the negative y direction and to output the x and y position that is taken as the centre point of the fingerprint. The image is then cropped to three different sizes centred around this pseudo-centre point.
…be one giant sector. This yields an image that is more uniform. The following equation is used for the normalization of each pixel, where the constant mean $M_0$ and variance $V_0$ are both set to 100, $i$ is the sector number, $M_i$ is the mean of the sector, and $V_i$ is the variance of the sector.
$$N_i(x,y) = \begin{cases} M_0 + \sqrt{\dfrac{V_0\left(I(x,y)-M_i\right)^2}{V_i}}, & \text{if } I(x,y) > M_i \\[1.5ex] M_0 - \sqrt{\dfrac{V_0\left(I(x,y)-M_i\right)^2}{V_i}}, & \text{otherwise} \end{cases} \qquad (6\text{-}14)$$
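A minimal sketch of this per-sector normalisation is shown below, assuming the image is a NumPy array and that `sector_mask` assigns each pixel to a sector; the mask construction itself is not shown.

```python
# Per-sector normalisation following equation (6-14), with M0 = V0 = 100.
import numpy as np

def normalise_sectors(img, sector_mask, M0=100.0, V0=100.0):
    img = img.astype(float)
    out = np.zeros_like(img)
    for s in np.unique(sector_mask):
        pix = img[sector_mask == s]
        Mi, Vi = pix.mean(), pix.var()
        if Vi == 0:                               # flat sector: keep the target mean
            out[sector_mask == s] = M0
            continue
        dev = np.sqrt(V0 * (pix - Mi) ** 2 / Vi)
        out[sector_mask == s] = np.where(pix > Mi, M0 + dev, M0 - dev)
    return out
```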
The normalized image is then passed through a bank of Gabor filters. Each filter is produced as a 33×33 filter image for six angles (0, π/6, π/3, π/2, 2π/3 and 5π/6) and convolved with the fingerprint image. Convolution in the spatial domain is rather slow, so multiplication in the frequency domain is used instead; however, this requires more memory to store the real and imaginary coefficients. The purpose of applying Gabor filters is to remove noise while preserving ridge structures and capturing the information contained in a particular direction in the image. The sectoring will then detect the presence of ridges in that direction. The Gabor filter also has an odd height and width to maintain its central peak. The following is the definition of the Gabor filter [122]:
$$G(x,y,f,\theta) = \exp\!\left\{-\frac{1}{2}\left(\frac{x'^2}{\sigma_{x'}^2} + \frac{y'^2}{\sigma_{y'}^2}\right)\right\}\cos(2\pi f x') \qquad (6\text{-}15)$$
where $x' = x\cos\theta + y\sin\theta$ and $y' = -x\sin\theta + y\cos\theta$ are the rotated coordinates,
Feature Vector
After obtaining the six filtered images, the variance of the pixel values in each sector is calculated. This reveals the concentration of fingerprint ridge directions in that part of the fingerprint. A higher variance in a sector means that the ridges in that image were running in the same direction as the Gabor filter; a low variance indicates that they were not, so the filtering smoothed them out. The resulting 360 variance values (6 × 60) form the feature vector of the fingerprint scan. The variance is calculated using the following equation, where $F_{i\theta}$ are the pixel values in the $i$th sector after a Gabor filter with angle $\theta$ has been applied, $P_{i\theta}$ is the mean of those pixel values, and $K_i$ is the number of pixels in the $i$th sector.
$$V_{i\theta} = \sum_{K_i}\bigl(F_{i\theta}(x,y) - P_{i\theta}\bigr)^2 \qquad (6\text{-}16)$$
The result is three feature vectors, one for each cropped image. The concatenation of these vectors forms the final feature used (equation 6-17).
$$V = \left[\,V_1 \; V_2 \; V_3\,\right] \qquad (6\text{-}17)$$
This vector is then used as the true data points, replacing the minutiae points as the feeder in the fuzzy vault scheme [45, 63-65] construction, to generate the required vault.
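The sketch below outlines the vector feature computation under the same assumptions: a 33×33 Gabor kernel per angle (equation 6-15), frequency-domain convolution, and the per-sector variance of equation (6-16). The filter parameters f and σ, and the use of SciPy's FFT-based convolution, are illustrative assumptions rather than the thesis settings.

```python
# Vector feature sketch: Gabor filter bank (eq. 6-15) plus per-sector
# variance (eq. 6-16); parameter values are illustrative assumptions.
import numpy as np
from scipy.signal import fftconvolve

ANGLES = (0, np.pi/6, np.pi/3, np.pi/2, 2*np.pi/3, 5*np.pi/6)

def gabor_kernel(theta, f=0.1, sigma=4.0, size=33):
    half = size // 2                              # odd size keeps a central peak
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-0.5 * (xr**2 + yr**2) / sigma**2) * np.cos(2 * np.pi * f * xr)

def vector_features(norm_img, sector_mask):
    """Variance V_i,theta of every sector in every filtered image."""
    feats = []
    for theta in ANGLES:
        filtered = fftconvolve(norm_img, gabor_kernel(theta), mode="same")
        for s in np.unique(sector_mask):
            pix = filtered[sector_mask == s]
            feats.append(np.sum((pix - pix.mean()) ** 2))   # sum of squared deviations
    return np.asarray(feats)

# Final feature (6-17): concatenate the vectors of the three cropped images, e.g.
# V = np.concatenate([vector_features(c, mask) for c in (crop1, crop2, crop3)])
```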
Figure (6-17) shows the relationship between the polynomial degree and the vault attack complexity, where 21 extracted feature points are used while the number of chaff points varies from 100 to 300. Figure (6-18) shows the relationship between chaff points, minimum distance and the releasability of the locked key. The minimum distance was set to satisfy the following rules: chaff points cannot be placed too close to real points, and there is no reason to place chaff points closer to each other than the minimum distance, because the attacker could immediately ignore such clusters as unlikely candidates. These placement rules are sketched below.
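The following small sketch of the placement rule accepts a candidate chaff point only if it keeps at least the minimum distance from every genuine point and from every previously accepted chaff point; the field size, the Manhattan distance metric, the representation of points as (x, y) pairs and the function name are assumptions for illustration.

```python
# Chaff placement sketch enforcing a minimum distance to real points and to
# previously placed chaff; distance metric and field size are assumptions.
import random

def generate_chaff(genuine, num_chaff, min_dist, field_size=65536):
    chaff, placed = [], list(genuine)
    while len(chaff) < num_chaff:
        cand = (random.randrange(field_size), random.randrange(field_size))
        if all(abs(cand[0] - p[0]) + abs(cand[1] - p[1]) >= min_dist for p in placed):
            chaff.append(cand)
            placed.append(cand)
    return chaff

# e.g. 200 chaff points kept at least 8 units away from all other points
# chaff = generate_chaff(genuine_points, num_chaff=200, min_dist=8)
```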
Figure 6-17 The attack complexity varies according to the degree of polynomial
Figure 6-18 The relationship between the key releasability and the minimum distance.
6.12 Summary
Biometric systems are being widely used to achieve reliable user authentication and these
systems will proliferate into the core information technology infrastructure. Therefore, it
is crucial that biometric authentication is secure. Fuzzy vault is one of the most
comprehensive mechanisms for secure biometric authentication and cryptographic key
protection. The fuzzy vault cryptographic key structure was investigated in order to derive guidelines for appropriate vault parameters and system tolerance. A shielding technique, "capsulation", was proposed to overcome the fingerprint fuzzy vault key management problem and to strengthen the key against cracking and attack. A practical fuzzy vault system based on fingerprint vector features was proposed; it can easily secure secrets such as 128-bit AES encryption keys.
Chapter 7 Conclusion and Future Work
7.1 Conclusion
Biometrics and cryptography have been seen as competing technologies and identified as two of the most important aspects of the digital security environment. Working separately, the two technologies develop in isolation, sometimes in competition with each other. For various types of security problems, the merging of these aspects has led to the development of the new bio crypt technology. Based on the merging technique, bio crypt is categorized into: (i) a loosely-coupled mode, in which biometric matching is decoupled from the cryptographic part; matching operates on the traditional biometric template and, if the templates match, the cryptographic key is released from its secure location, e.g. a server or smart card; and (ii) a tightly-coupled mode, in which biometrics and cryptography are merged at a much deeper level, where matching can effectively take place within the cryptographic domain, hence there is no separate matching operation that can be attacked; the key is extracted from a collected heterogeneous mass (key/bio template) as a result of a positive match. Bio crypt thus offers the hope of an ideal technology combination and security integration. The bio crypt process can be carried out in three different modes: key generation, binding and construction.
Biometric key generation usually suffers from a low ability to discriminate, which can be assessed in terms of key stability and key entropy. Key stability refers to the extent to which the key generated from the biometric data is repeatable. Key entropy relates to the number of possible keys that can be generated. While it is possible to derive a key directly from biometric features, it is difficult to achieve high key entropy and high key stability simultaneously.
…reveal much information about the key or the biometric template, i.e., it is impossible (or
computationally infeasible considering cost and time limitations) to decode the key or the
template without any knowledge of the user’s biometric data. A bio crypt matching
algorithm is used to perform authentication and key release in a single step.
Biometric cryptosystems that work in the key binding or key generation modes are difficult to implement due to the large intra-class variations in biometric data, i.e., samples of the same biometric trait of a user obtained over a period of time can differ substantially. For
example, in the case of fingerprints, factors such as translation, rotation, partial overlap,
non-linear distortion, pressure and skin condition, noise and feature extraction errors lead
to large intra-class variations.
The cryptographic construction mode is designed to work with biometric features which
are represented as an unordered set. Its ability to deal with intra-class variations in the biometric data, along with its ability to work with the unordered sets commonly encountered in biometrics, makes the construction mode, the fuzzy vault, a promising solution for biometric cryptosystems.
The bio crypt key has the following benefits: (i) it increases the security of the system and (ii) it enhances the privacy of the biometric template and extracted feature vectors. However, bio crypt technology suffers from several limitations, e.g. biometric image quality, validity, image alignment, cancelability, key revocation and repeatability. These challenges affect the performance, accuracy and interoperability of any developed bio crypt based system.
To circumvent the biometric image quality problems, three new non-reference algorithms were proposed:
…object segmentation, background subtraction, total image thresholding and pixel weight calculation.
A hybrid method for fingerprint image validity and quality computation, in which both statistical and spectrum analysis are combined to detect the validity as well as the quality of the tested image.
The biometric template information was used to generate and construct a revocable and cancelable key by:
…may help raise the security level of the bio crypt to cryptographically acceptable values.
Fuzzy vault construction using combined biometrics, e.g. fingerprint minutiae and iris data, will increase the capacity of the cryptographic key and help solve the key management problem. Combining multi-modal biometric features is a promising approach to enhance vault security and reduce the false accept rate of the system without affecting the false reject rate. Employing multimodal biometric systems will overcome the accuracy and vulnerability limitations.
Vault unlocking can be viewed in two contexts. The first is the complexity of a valid user unlocking a vault with a matching fingerprint image. The second is the complexity of an attacker without fingerprint information trying to crack the vault. Combining error correction codes with stable and ordered multi-modal biometric templates would maximize the attacking complexity and reduce the valid user's unlocking computation complexity; this could be a promising direction for future research.
A bio crypt based system has several advantages over traditional password based systems. The bio crypt vault aims to secure critical data (e.g. a secret encryption key) with the fingerprint data in such a way that only the authorized user can access the secret by providing a valid fingerprint, and some implementation results for fingerprint vaults have been reported. However, all the previous results assumed that the fingerprint features were pre-aligned, and automatic alignment in the fuzzy vault domain remains an open and challenging issue; therefore, integrating automatic alignment of fingerprint features into the fuzzy fingerprint vault domain could be a future research direction.
References:
[6] A. Jain, L. Hong, and R. Bolle, "On-Line Fingerprint Verification," IEEE Trans.
Pattern Anal. Mach. Intell., vol. 19, pp. 302-314, 1997.
[8] B. Schneier, Applied Cryptography, 2nd ed.: John Wiley & Sons, New York, 1996.
[10] W. Stallings, Cryptography and Network Security: Principles and Practice:
Prentice Hall College, 2006.
[11] P. Reid, Biometrics and Network Security: Prentice Hall PTR, 2003.
[16] A. K. Jain and U. Uludag, "Hiding biometric data," IEEE Transactions on Pattern
Analysis and Machine Intelligence, vol. 25, pp. 1494-1498, 2003.
[19] M. S. ALTARAWNEH, W.L.WOO, and S.S DLAY, "OBJECTIVE
FINGERPRINT IMAGE QUALITY ASSESSMENT USING GABOR
SPECTRUM APPROACH," presented at DSP 2007, Wales, UK, 2007.
[21] "http://bias.csr.unibo.it/fvc2004/default.asp."
[22] "Micro and Nano Systems (MNS) research group at TIMA laboratory in Grenoble,
France," http://tima.imag.fr/mns/research/finger/fingerprint/index.html.
[23] E. Lim, X. Jiang, and W. Yau, "Fingerprint quality and validity analysis,"
presented at Proc. IEEE int. Conf. On image Processing, ICIP, Sept.2002.
[24] E. Lim, K.-A. Toh, P. N. Suganthan, X. Jiang, and W.-Y. Yau, "Fingerprint image
quality analysis," presented at Image Processing, 2004. ICIP '04. 2004
International Conference on, 2004.
[25] L. Shen, A. Kot, and W. Koo, "Quality Measures of Fingerprint Images," Lecture Notes in Computer Science, vol. 2091, pp. 266, 2001.
[26] J. Qi, D. Abdurrachim, D. Li, and H. Kunieda, "A Hybrid Method for Fingerprint
Image Quality Calculation," in Proceedings of the Fourth IEEE Workshop on
Automatic Identification Advanced Technologies: IEEE Computer Society, 2005,
pp. 124-129.
[27] N. B. Nill and B. H. Bouzas, "Objective image quality measure derived from
digital image power spectra," Optical Engineering, vol. 31, pp. 813-825, April
1992.
[29] Y. Chen, S. Dass, and A. K. Jain, "Fingerprint quality indices for predicting
authentication performance," presented at AVBPA, Rye Brook, NY, July 2005.
[30] B. Lee, J. Moon, and H. Kim, "A novel measure of fingerprint image quality using
the Fourier spectrum," Proceedings of the SPIE, vol. 5779, pp. 105-112, 2005.
[31] S. Joun, H. Kim, Y. Chung, and D. Ahn, "An Experimental Study on Measuring
Image Quality of Infant Fingerprints," LNCS, vol. 2774, pp. 1261-1269, 2003.
[35] E. Tabassi, C. Wilson, and C. Watson, "Fingerprint image quality," NIST research
report NISTIR7151, August, 2004.
[39] A. Bodo, "Method for producing a digital signature with aid of a biometric
feature." Germany: German patent DE 42 43 908 A1, 1994.
[40] P. Janbandhu and M. Siyal, "Novel biometric digital signatures for internet based application," Inf. Manage. Comput. Secur., vol. 9, pp. 205-212, 2001.
[44] G. I. Davida, Y. Frankel, and B. J. Matt, "On enabling secure applications through
off-line biometric identification," presented at IEEE Symposium on Security and
Privacy Proceedings, USA, 1998.
[45] A. Juels and M. Sudan, "A fuzzy vault scheme," presented at Information Theory,
2002. Proceedings. 2002 IEEE International Symposium on, 2002.
[48] G. I. Davida, Y. Frankel, B. J. Matt, and R. Peralta, "On the relation of error
correction and cryptography to an offline biometric based identification scheme,"
presented at Workshop Coding and Cryptography (WCC’99), 1999.
[51] F. Monrose, M. K. Reiter, and S. Wetzel, "Password hardening based on
keystroke dynamics," presented at Proceedings of the 6th ACM conference on
Computer and communications security, 1999.
[55] H. Feng and C. C. Wah, "Private key generation from on-line handwritten
signatures," Information Management & Computer Security, vol. 10, pp. 159-164,
2002.
[57] Y. W. Kuan, A. Goh, D. Ngo, and A. Teoh, "Cryptographic Keys from Dynamic
Hand-Signatures with Biometric Secrecy Preservation and Replaceability," in
Proceedings of the Fourth IEEE Workshop on Automatic Identification Advanced
Technologies, 2005, pp. 27-32.
[59] W. Zhang, Y.-J. Chang, and T. Chen, "Optimal thresholding for key generation
based on biometrics," presented at International Conference on Image Processing,
2004.
[65] U. Uludag, S. Pankanti, and A. K. Jain, "Fuzzy Vault for Fingerprints," presented
at Fifth International Conference on Audio- and Video-based Biometric Person
Authentication, Rye Town, USA, 2005.
[66] U. Uludag and A. Jain, "Securing Fingerprint Template: Fuzzy Vault with Helper Data," in Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop: IEEE Computer Society, 2006, pp. 163.
[67] Y. Chung, D. Moon, S. Lee, S. Jung, T. Kim, and D. Ahn, "Automatic Alignment
of Fingerprint Features for Fuzzy Fingerprint Vault," presented at Information
Security and Cryptology, Beijing, China, 2005.
[68] S. Lee, D. Moon, S. Jung, and Y. Chung, " Protecting Secret Keys with Fuzzy
Fingerprint Vault Based on a 3D Geometric Hash Table," presented at ICANNGA
2007, Warsaw, Poland, 2007.
[71] N. Ratha, S. Chen, and A. Jain, "Adaptive flow orientation based feature
extraction in fingerprint images," Pattern Recognition, vol. 28, pp. 1657-1672,
Nov. 1995.
[72] A. K. Jain, N. K. Ratha, and S. Lakshmanan, " Object Detection Using Gabor
Filters," Pattern Recognition, vol. 30, pp. 295-309, 1997.
[74] J. Yin, E. Zhu, X. Yang, G. Zhang, and C. Hu, "Two steps for fingerprint
segmentation," Image Vision Comput., vol. 25, pp. 1391-1403, 2007.
[76] N. Otsu, "A Threshold Selection Method from Gray-Level Histograms," IEEE
Transactions on Systems, Man, and Cybernetics, vol. 9, pp. 62-66, 1979.
[77] L. Hong, Y. Wan, and A. Jain, "Fingerprint Image Enhancement: Algorithm and
Performance Evaluation," IEEE Transactions on Pattern Analysis and Machine
Intelligence, vol. 20, pp. 777-789, 1998.
[78] F. Alonso-Fernandez, J. Fierrez-Aguilar, and J. Ortega-Garcia, "A review of schemes
for fingerprint image quality computation," presented at 3rd COST- 275
Workshop, Biometrics on the Internet, European Commission, 2005.
[80] Z. Wang and A. C. Bovik, Modern Image Quality Assessment: Morgan &
Claypool, 2006.
[81] E. Tabassi and C. L. Wilson, "A novel approach to fingerprint image quality,"
presented at IEEE International Conference on Image Processing, 2005.
[82] Y. Chen, S. Dass, and A. Jain, "Fingerprint Quality Indices for Predicting
Authentication Performance.," presented at Audio-and-Video-based Biometric
Person Authentication, Rye Town, NY, 2005.
[86] A. C. Bovik, Handbook of Image and Video Processing: Academic Press, Inc.,
2005.
[88] R. J. Safranek, T. N. Pappas, and J. Chen, "Perceptual Criteria for Image Quality
Evaluation," A. Bovik, Ed.: Academic Press, 2004.
[89] S. Daly, "The visible differences predictor: an algorithm for the assessment of image fidelity," in Digital Images and Human Vision: MIT Press, 1993, pp. 179-206.
[91] M.S. Altarawneh, L.C.Khor, W.L.Woo, and S. S. Dlay, "A NON Reference
Fingerprint Image Validity via Statistical Weight Calculation," JOURNAL OF
DIGITAL INFORMATION MANAGEMENT, vol. 5, 2007.
[93] M. Carnec, P. Le Callet, and D. Barba, "Full reference and reduced reference
metrics for image quality assessment," presented at Seventh International
Symposium on Signal Processing and Its Applications, 2003.
[101] "Final report from the video quality experts group on the validation of objective
models of video quality assessment, phase II, March 2003.."
[103] W. Stallings, Cryptography and Network Security, 4 ed: Prentice Hall, 2006.
[109] M. D. Garris, C. I. Watson, R. M. McCabe, and C. L. Wilson, User's Guide to
NIST Fingerprint Image Software (NFIS), NISTIR 6813, 2001.
Processing, 2005. Proceedings. (ICASSP '05). IEEE International Conference on,
2005.
[120] E.-C. Chang, R. Shen, and F. W. Teo, "Finding the original point set hidden
among chaff," presented at ACM Symposium on Information, computer and
communications security, Taipei, Taiwan, 2006.