A Standoff System for Noncooperative Ocular Biometrics
Plamen Doynov and Reza Derakhshani, Ph.D.
Department of Computer Science and Electrical Engineering
University of Missouri at Kansas City
Kansas City, MO 64110, USA
Abstract—The iris and, more recently, the vascular patterns seen on the white of the eye have been considered for ocular biometrics. The non-contact nature, uniqueness, and permanence of ocular features make them promising. Among the new challenges is the development of commercial systems for less constrained environments and extended distances. Such systems need to place a minimal burden on the user and be robust for non-cooperative users.
We present the design and development of a standoff system for noncooperative ocular biometrics using a system-integration approach. A review of existing commercial and experimental long-range biometric systems is presented. The process of selecting sensors and illumination techniques is described. The development of user interfaces and algorithms for a working prototype is explained. The performance is evaluated with images of 28 subjects, acquired at distances of up to 9 meters. The conflicting requirements for the design of this standoff biometric system, and the resulting performance limitations with impact on image quality, are discussed.
Keywords—Ocular biometrics; Standoff biometric system; Noncooperative biometrics.
I. INTRODUCTION
Biometric technologies have been implemented in many
application areas and are replacing traditional authentication
methods [1-3]. Ocular biometrics refers to the imaging and use
of characteristic features of the eyes for personal recognition.
The proliferation of ocular biometrics is based on its inherent
advantages [4, 5] and it is made possible by recent progress in
related technologies and processing algorithms [6-8].
Traditional face and fingerprint recognition may also be augmented by additional biometric traits, such as ocular traits, for greater accuracy and security [9]. However, many challenges
remain, especially for iris image acquisition in unconstrained
conditions and without the necessary degree of user
cooperation [10-12]. Research teams and commercial
developers have responded by creating uni- and multi-modal
systems for real-world conditions [13, 14]. The goal is robust
performance in variable lighting conditions and subject-to-camera distances for moving subjects, off-angle images, and other factors that generally diminish captured image quality.
In the next section of this paper, we review some notable long-range ocular biometric systems and describe their important parameters and limiting factors. Section three outlines the requirements of the front-end imaging system of a standoff ocular biometric system. We describe the key components that affect image quality and overall system performance. Section four describes the development of a standoff ocular biometric system using system integration of commercially available, off-the-shelf components. In section five, we report the corresponding results for ocular images of 28 volunteers at distances of up to 9 meters. In conclusion, we outline the key attributes of the imaging system for standoff ocular biometrics, the challenges we faced, and future work based on the lessons learned.
II. ACQUISITION OF OCULAR BIOMETRIC TRAITS
A. Standoff Systems for Ocular Image Acquisition
At long distances, capturing the eye with sufficient resolution and quality is a challenging proposition [15-17]. The challenges are elevated even further when the imaging system has to work well without cooperation from the user. Proenca et al. address important issues and trends in noncooperative iris recognition, and have created the UBIRIS.v2 database of iris images captured "on the move" and "at a distance" [18-20]. The authors use the visible spectrum for imaging as an alternative to the customary near infrared (NIR). Wheeler et al. [21] describe a standoff iris capture system designed to work at up to 1.5 m using a pair of cameras with a wide field of view for face localization and an iris camera to capture the iris. Dong et al. [22] discuss the design of a system to image the iris at a distance of 3 meters. The "Iris on the Move" system of Sarnoff Corporation also has a reported standoff distance of 3 meters; it is a portal-style system with a 210 mm, F/5.6 lens [23]. Du et al. [24] describe and use the IUPUI multi-wavelength database, acquired at 10.3 feet from camera to subject using a MicroVista NIR camera with a Fujinon zoom lens.

AOptix Technologies [25] uses adaptive optics with wavefront sensing and closed-loop control for a standoff system with a working volume at 2 to 3 meters from the camera. Retica reports that their multi-biometric system achieves 77% true match rates at 4.5 meters on the first attempt and 92% after three attempts [26].

B. Augmented Standoff Acquisition Systems
Most current ocular biometric systems are based on the unique iris patterns of the human eye. Their performance depends directly on iris image quality, which is adversely affected by distance. Recently, improvements using additional ocular modalities have been investigated [9, 12, 34]. Simultaneous acquisition of the iris, the vascular patterns on the white of the eye, and periocular patterns may also reduce user constraints or the requirements for compulsory user cooperation.
Periocular features may be useful for long-distance recognition; however, they are not as specific as those of the iris or the vasculature seen on the white of the eye. The latter is especially amenable to being captured at longer distances in the visible spectrum and with an off-angle iris.

In an effort to extend the depth of field, another challenge in standoff ocular biometrics, Boddeti and Kumar [27] investigate the use of wavefront-coded imagery for iris recognition. They conclude that wavefront coding could help increase the depth of field of an iris recognition system by a factor of four. McCloskey et al. [28] explore a "flutter shutter" technique to acquire focused iris images from moving subjects while eliminating motion blur. Researchers have also explored "structured" light, the visible spectrum, and imaging under different illumination wavelengths, as opposed to the NIR range (700 to 900 nm) that is typically used in commercial systems. Ross et al. [29] investigate imaging with illumination in the 950 nm to 1650 nm range at short distances. Grabowski et al. [30] describe iris imaging that allows characterization of structures in the iris tissue. He et al. [31] design their own camera for iris capture using a CCD sensor with 0.48-megapixel resolution and a custom 250 mm fixed-focus lens. They use an LED light source at 800 nm and a NIR band-pass filter to minimize specular reflections from the cornea of the eye.
III. PARAMETERS AND REQUIREMENTS OF STANDOFF OCULAR BIOMETRIC SYSTEMS
The acquisition of quality images is the most important step in standoff ocular biometrics. There are specific requirements and performance challenges for the image capturing equipment. Proximity to the subject, illumination, and viewing angle are among the confounding variables. Moving subjects have a limited residence time in the field of view and within the imaging depth of field. Even with some degree of cooperation, the orientation of the face and eyes is not always ideal. Image resolution decreases with increased distance, and the light collected by the lens aperture decreases in inverse proportion to the square of the distance. Imaging at a higher f-number increases the depth of field but limits the number of collected photons and consequently requires longer exposure times (or increased illumination intensity). Imaging with long exposure times is prone to motion blur.
In their comprehensive tutorial [32], Matey and Kennell examine the requirements for iris capture at distances greater than one meter. The authors describe many relevant parameters, including wavelength, type of light source and eye safety, required characteristics of the lens, signal-to-noise ratio, capture volume, and the subject's residence time. Describing the "Iris on the Move" system, Matey summarizes the requirements for a standoff iris imaging system [33]. He indicates the need for approximately 100 microns of resolution at the eye (100 pixels across the iris); a 40 dB signal-to-noise ratio (SNR); and 90 and 50 levels of intensity separation between the iris-sclera and iris-pupil boundaries, respectively, for an 8-bit imaging sensor. To cover a double-door passage area, Matey calculates that a system needs 150 megapixels of sensor resolution to achieve 100 pixels across the imaged iris, given an average iris size of 10 mm. Because of this, one customary approach is to use a wide field-of-view camera to locate the eyes in tandem with a second, narrow field-of-view camera for imaging.

Localization of the eyes for subjects on the move is not a trivial task either. Again, many systems use a camera with a wide field of view to locate the face and subsequently the eyes, and a high-resolution, high-magnification camera for iris capture. To cover a wider field of view, the high-resolution camera may be mounted on a pan-and-tilt stand and use a lens with optical zoom (PTZ). In this case, the mechanical stability, speed, and pointing accuracy of the PTZ system become crucial. Even at only 1 meter standoff, 100 microns of resolution for iris capture requires 100 micro-radians (0.006 degrees) of pointing stability [23].

Extreme standoff distances are limited by the governing laws of light propagation and by the capability and price of current technology and components. In the next section, we describe the design and construction of a standoff imaging system using system integration of commercially available components at low to medium cost.
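The numbers above can be reproduced with a short back-of-the-envelope script (our illustration, using only figures quoted from [23, 33]; the capture-window dimensions are an assumption, since the paper gives only the resulting pixel count):

```python
import math

# Angular pointing stability: 100 um resolved at 1 m standoff (after [23]).
angle_rad = 100e-6 / 1.0                 # small-angle approximation
print(f"{angle_rad * 1e6:.0f} urad = {math.degrees(angle_rad):.4f} deg")  # ~0.006 deg

# Sensor pixel budget for 100 pixels across a 10 mm iris (after [33]).
px_per_mm = 100 / 10.0                   # 10 px/mm at the subject
# Assumed double-door capture window (width x height); not stated in the paper:
width_mm, height_mm = 1500, 1000
total_px = (width_mm * px_per_mm) * (height_mm * px_per_mm)
print(f"{total_px / 1e6:.0f} megapixels")  # ~150 MP, matching Matey's estimate
```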
IV. AN ACQUISITION PLATFORM FOR NONCOOPERATIVE, LONG-RANGE OCULAR BIOMETRICS

The following describes our novel acquisition platform for long-range ocular biometrics, designed to image the iris in NIR from standoff distances of up to 10 meters, possibly without the necessary cooperation from the subjects.

A. Design Approach and Considerations
The original idea was to focus on the development of the front-end optics and image sensor for long-range acquisition. Furthermore, to relax the requirements for subject cooperation, we investigated the electronic and electromechanical components required to locate a subject's eyes, detect optimal gaze, and acquire an image (or images) of sufficient quality. Our design concept addressed the following three functional aspects:
1. Eye and iris localization with gaze detection and alignment: using a PTZ mechanism, the camera is to make eye contact with the subjects (rather than the opposite, which requires cooperative subjects):
• a customized Eyebox2 (XUUK™ Inc., Kingston, ON, Canada) system for long-range person/face/eye and gaze detection;
• the ability to simultaneously control multiple iris cameras for crowd scanning.
2. Use of advanced imaging techniques, especially lucky imaging, for long-range acquisition of the iris:
• high-magnification precision optics;
• a NIR-enhanced high-speed image sensor in burst mode (high frame rate);
• real-time algorithms to evaluate image quality and to select the best image from a sequence (a sketch follows this list);
• synchronized active illumination; and
3. Subject/eye tracking until a good-quality image is obtained.
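As an illustration of the lucky-imaging idea in item 2, a minimal best-frame selector (our sketch, not the system's actual code), using OpenCV and a Tenengrad-style gradient score, one of the measures listed in Section V:

```python
import cv2
import numpy as np

def tenengrad(gray: np.ndarray) -> float:
    """Tenengrad focus measure: mean squared Sobel gradient magnitude."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    return float(np.mean(gx ** 2 + gy ** 2))

def best_frame(burst: list[np.ndarray]) -> np.ndarray:
    """Return the sharpest frame of a burst ("lucky imaging" selection)."""
    scores = [tenengrad(frame) for frame in burst]
    return burst[int(np.argmax(scores))]
```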
The performance requirements for resolution are based on the SNR and optical resolution of the acquired signal. The quality of images at a distance is related to the lens aperture, residence time (frame exposure duration), and light intensity at the imaging sensor. Practical usability and the need to avoid blurring caused by subject movement are additional constraints. Maximum illumination is limited by eye-safety factors. Thus, compliance with ANSI/IESNA Standard RP-27.1-96 and its testing methodology is critical. In late 2008, a newer standard, referred to as IEC 62471-2006, was adopted. It addresses the photo-biological safety of lamps, lamp systems, and specifically the safety of LED sources, as applicable to ocular imaging illumination.

Design considerations have to include the diffraction limit of a lens and its dependence on the wavelength and the aperture diameter (optical resolution covariates). These requirements on the optical system and image sensor are more stringent in the case of non-cooperative subjects.
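For a sense of scale, a short calculation (ours, not from the paper) of the Rayleigh diffraction limit for the parameters used later in this section: an 80 mm aperture and 850 nm illumination.

```python
import math

def rayleigh_limit_um(wavelength_nm: float, aperture_mm: float,
                      distance_m: float) -> float:
    """Smallest resolvable feature (um) at the subject, Rayleigh criterion:
    x = 1.22 * lambda * d / D."""
    return 1.22 * wavelength_nm * 1e-9 * distance_m / (aperture_mm * 1e-3) * 1e6

for d in (3, 6, 9):
    print(f"{d} m: ~{rayleigh_limit_um(850, 80, d):.0f} um per resolvable feature")
# ~39 / 78 / 117 um at 3 / 6 / 9 m: an 80 mm aperture stays near the ~100 um
# target of Section III even at 9 m, before aberrations and motion blur.
```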
B. Component Selection and Design Implementation
The focus areas while implementing the standoff system design were: a) optics; b) illumination; c) imaging sensor(s); d) eye/gaze detection and tracking; and e) near real-time ocular image quality metrics.

The optical front end is the most important part of the imaging system, with three main components: a long focal length lens; an eyepiece for additional magnification; and an image sensor (camera) attached to the optics. After considering different telephoto lenses, we first used Infinity's K2/S remote microscope (Infinity, Boulder, CO, USA) coupled with a digital camera (Fig. 1.a). The K2/S is an excellent optical system with many advanced features. However, with its small aperture and limited field of view, the K2/S requires intense illumination and pointing accuracy, even at short distances (e.g., less than 3 m). To obtain the necessary resolution at the targeted standoff distances, we resorted to digiscoping as one of the simplest, yet most effective, methods for high-magnification imaging. Just as with camera lenses, a high-performance glass objective is needed to produce the sharpest images with the best color reproduction and optical resolution. For cost considerations, we first explored the performance of Meade's LX90-ACF (Meade Instruments, Irvine, CA, USA) Schmidt-Cassegrain telescope (Fig. 1.b), because of its large aperture and the intrinsically high photon collection efficiency of a reflective telescope.
Figure 1. Telescopes coupled with cameras for standoff ocular imaging testing: a) Infinity K2/S; b) Meade LX90; and c) Swarovski STS80 HD.
In parallel, we evaluated the high-definition STS80 HD spotting scope (Swarovski Optik North America, RI, USA) with an 80 mm objective lens and an available selection of magnifying and optical-zoom eyepieces (Fig. 1.c). The objective lens focal length is 460 mm and the weight is 1330 g. Besides the typical limitations of refractive telescopes, the STS80 HD has advantages such as a short minimum focusing distance (5 m), ease of coupling with a variety of digital cameras, and a flexible camera adapter with direct access for imaging target location and focus adjustment. In the final design, we selected an eyepiece with a fixed power, which makes placement of the image sensor behind the eyepiece less critical and, more importantly, produces less vignetting (darkening around the edges of an image) compared to a typical zoom eyepiece.

Because of the need for high magnification and to reduce the effect of subject-camera movements, a standoff system requires short exposure times using "fast" lenses and sensitive image sensors with larger pixels, plus proper active illumination. Historically, NIR illumination is used to facilitate the imaging of irises with dark pigmentation. For multimodal ocular biometrics, the visible spectrum can be used for periocular imaging and to image the vasculature seen on the white of the eye (wavelengths in the green bandwidth). NIR illumination, however, does not evoke the protective pupil constriction reflex, and thus the use of extreme light intensity in a standoff system is limited. IR-A (near IR, 780 to 1400 nm) reaches the sensitive cells of the retina, and thus for high-irradiance sources the retina is at risk of acute exposures from localized light sources. IR-B (mid IR, 1400 to 3000 nm) and IR-C (far IR, 3000 nm to 1 mm) present a risk to both the skin and the cornea from "flash burns." The heat deposited in the cornea may be conducted to the eye's lens and cause clouding (albeit with very long exposure times). The IR illumination intensity for short exposures should be limited to

    E_λ · Δλ < 1.8 · t^(-3/4)                (1)

where E_λ is the spectral irradiance in W/cm², Δλ is the spectral bandwidth in nm, and t is the exposure time in seconds (International Commission on Non-Ionizing Radiation Protection, ICNIRP 1996, 2000). We used a digital hand-held multispectral intensity meter, a PM100 with S120B sensor (ThorLabs, Newton, NJ, USA), to measure and monitor illumination intensity levels, and also to evaluate the light-capturing capability of the STS80 HD.

After testing different commercial light sources, we selected the LIR850-25 (LDP LLC - MaxMax.com, Carlstadt, NJ, USA) with an effective range of up to 150 m (Fig. 2). Each source has 147 IR LEDs and can operate in a synchronous flash mode with the camera.
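A small helper (our sketch) implementing the ICNIRP limit in Eq. (1); the example numbers are placeholders, not measurements from the paper.

```python
def icnirp_ir_limit(exposure_s: float) -> float:
    """Maximum allowed E_lambda * delta_lambda product (per Eq. 1) for a
    short IR exposure of t seconds: 1.8 * t**(-3/4)."""
    return 1.8 * exposure_s ** -0.75

def is_safe(spectral_irradiance_w_cm2: float, bandwidth_nm: float,
            exposure_s: float) -> bool:
    """Check a narrowband source (e.g., an 850 nm LED flash) against Eq. (1)."""
    return spectral_irradiance_w_cm2 * bandwidth_nm < icnirp_ir_limit(exposure_s)

# Placeholder values for a 10 ms synchronized NIR flash:
print(icnirp_ir_limit(0.010))    # limit ~56.9 for t = 10 ms
print(is_safe(0.5, 25, 0.010))   # hypothetical source: 0.5 W/cm^2 over 25 nm
```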
Figure 2. The system for standoff ocular imaging with the Swarovski STS80 HD telescope and two LIR850-25 light sources.
We tested several cameras with enhanced NIR sensitivity. The SMX-150M camera was selected because of the enhanced NIR spectral characteristics of its IBIS5-AE-1300 image sensor (Sumix, Oceanside, CA, USA). It has a 1.3-megapixel (1280 × 1024) sensor with high sensitivity from 400 to 1000 nm, increased light-collection quantum efficiency due to its larger pixel size of 6.7 × 6.7 µm, a monochromatic design, and extensive real-time control and visualization through its PC interface. The upper curve in Fig. 3 represents the enhanced NIR efficiency of the sensor, and the vertical dashed line marks the 850 nm wavelength of the LED light source. The USB-6009 hardware module from National Instruments (Austin, TX, USA), a USB-based digital and analog control and acquisition module, was used to trigger the light sources and the camera.
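The triggering logic can be sketched as follows (our illustration using the NI-DAQmx Python bindings rather than the LabVIEW environment actually used; the device name, line assignments, and pulse timings are assumptions):

```python
import time
import nidaqmx

# Fire a synchronized trigger pulse to the NIR flash and the camera on one
# digital line each. "Dev1" and the line names are assumed for this sketch;
# the paper's actual wiring and LabVIEW code are not reproduced here.
with nidaqmx.Task() as task:
    task.do_channels.add_do_chan("Dev1/port0/line0")  # flash trigger (assumed)
    task.do_channels.add_do_chan("Dev1/port0/line1")  # camera trigger (assumed)
    for _ in range(5):                 # a five-image burst
        task.write([True, True])       # raise both lines together
        time.sleep(0.001)              # 1 ms pulse width (placeholder)
        task.write([False, False])
        time.sleep(0.050)              # inter-frame gap (placeholder)
```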
Figure 3. The SMX-150M camera (left) and the spectral characteristics of its enhanced IBIS5-AE-1300 sensor (upper red curve).
For eye localization and gaze detection, we used the XUUK™ camera/light-source system (Fig. 4.a), which was originally designed to detect and count the human eyes looking at a target such as a billboard. The application detects gaze based on the red-eye effect. We used the provided development kit to write an application that reports the relative coordinates of detected faces/eyes to a pan-and-tilt module to point the mounted imaging system. We used the PTU-D46-70 pan-and-tilt unit (Fig. 4.b) from Directed Perception (Burlingame, CA, USA). We selected this particular model for its high-speed, accurate positioning of payloads of up to 9 lbs at speeds of up to 60 degrees/second, with a position resolution of 0.013 degrees. An application developed in LabVIEW (National Instruments) distributes the detected eye coordinates to the networked pan-and-tilt controller using TCP/IP.
Figure 4. The Eyebox2 XUUK™ camera (a) and the PTU-D46-70 module (b).

Fig. 5 displays the functionality of the XUUK™ system in locating a person, his eyes, and the presence of direct gaze. In Fig. 5(a), the person is not detected. In 5(b), the face is detected and given an associated sequence number. Fig. 5(c) indicates one person detected and "looking" (gaze directed axially at the XUUK camera). Fig. 5(d) demonstrates detection of a person looking at the camera from a distance of about 7.5 meters.

Figure 5. Performance of the XUUK™ camera: no detection (a); detection of a person/face (b); direct gaze detection at about 1 m distance (c); and detection of direct axial gaze at a distance of approximately 7.5 m (d).

V. IMAGE ACQUISITION AND SYSTEM PERFORMANCE EVALUATION

The first image acquisition trial included 13 volunteers, under the auspices of the Institutional Review Board. The acquisition included multiple bursts of five-image sequences, which were obtained from distances of 0.75, 6 (Fig. 6), 7, 8, and 9 meters (Fig. 7). Background NIR light was used for manual focus, and a synchronized electronic NIR "flash" was used for the burst captures. In a second trial, eye images from an additional group of 15 volunteers were acquired.

Figure 6. Images acquired at 6 meters without (left) and with glasses (right).

Figure 7. Images acquired at 9 meters standoff distance.
The number of pixels per iris diameter decreases in proportion to the distance between the camera and the subject. We noted that at a 6 m distance, the combined magnification of the lens-eyepiece optical system results in a field of view sufficient to image a single eye with more than 320 pixels per iris diameter. At a 9 m distance, the average iris image diameter decreases to about 210 pixels.
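This inverse-proportional scaling can be checked directly (our sketch): calibrating on the reported 320 pixels at 6 m predicts about 213 pixels at 9 m, close to the observed 210.

```python
def iris_pixels(distance_m: float, px_ref: float = 320.0,
                d_ref: float = 6.0) -> float:
    """Pixels across the iris, scaled inversely with distance from a
    calibration point (320 px at 6 m, as reported for our platform)."""
    return px_ref * d_ref / distance_m

for d in (6, 7, 8, 9):
    print(f"{d} m: ~{iris_pixels(d):.0f} px per iris diameter")
# 9 m -> ~213 px, consistent with the ~210 px observed.
```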
All images were stored for computational quality assessment and feature extraction. A number of quality factors were implemented and used to select the best frames from a series of burst shots taken at distances of 1 to 9 meters. The images were first segmented, and the lower part of the iris region was used to compute the local quality measures for best-frame selection. Only correctly segmented frames were selected for the computation of quality factors. Different quality measures were tested, including: gradient-based metrics (Tenengrad; adaptive, separable, and non-separable Tenengrad; Laplacian; adaptive Laplacian); correlation-based measures (single-sample autocorrelation function; area and height of the central peak of the correlation function); statistics-based measures (absolute central moment, grey-level variance, Chebyshev moments/ratios, entropy, histogram); transform-based measures (Fourier transform coefficients and magnitude, cosine transform, multivariate kurtosis, wavelets); and edge-based measures (step-edge characteristics, transition width, local kurtosis) [34]. Focus quality assessment was used for best-frame selection and to illustrate the quality variation with distance. Fig. 8 displays the best achieved quality scores at different distances [35].
Figure 8. Best combined quality scores for different standoff distances using our long-range acquisition platform.

VI. DISCUSSION AND CONCLUSIONS

In this paper we reported the design considerations and implementation of a standoff image acquisition system for ocular biometrics. We used system integration of commercially available, off-the-shelf components. We described important and often conflicting requirements for the front-end optical system.

Regarding hardware, we concluded that red-eye detection (e.g., by using the XUUK™ system) can be successfully used to detect gaze direction from a localized distant face and eyes. The eye coordinates can be used to control a PTZ system to reduce subject cooperation requirements. We conclude that:
• Embedded computing power is critical for the real-time performance of the eye discovery and tracking algorithm;
• The main camera lens has to have a large aperture for high photon collection, and the image sensor needs high quantum collection efficiency at the wavelength of illumination (e.g., a large pixel size);
• The camera needs to have a high frame rate and electronically controlled settings;
• Ambient light degrades image quality when a NIR band-pass filter is not used. This is due to the different focal planes of different wavelengths and/or glare and specular reflection;
• Overall, the optical front end has to have high magnification and quality (e.g., low geometric distortion, fast, large aperture, high photon transfer, and sufficient depth of field). It needs to be adapted for electronic focus and, preferably, include a fast and accurate assessment of subject distance;
• The sensor has to be synchronized with the NIR illumination sources, with intensity levels kept within the exposure safety requirements;
• The pan-and-tilt system has to have the ability to respond to reference coordinates with fast vector movements and motion stabilization. Alternative and non-traditional pan-and-tilt mechanisms (e.g., arc-mounted or disk-based) could be better suited for this application, to accommodate the lens and camera assembly and to operate with less vibration;
• A constellation of multiple distributed, networked imaging cameras may cover a larger work volume and acquire better images based on control from one or more gaze detection systems.
A number of quality factors were calculated and used to select the best frames from a series of burst shots of each of the 28 subjects, taken from distances of 1, 6, 7, 8, and 9 meters. The results demonstrate the degradation of image quality with increased standoff distance. Inter-subject comparison at the same distance suggests that iris color assessment and selective spectral illumination/imaging may increase image quality. This option is applicable to a design with multiple networked cameras and illumination sources.

The illumination intensity level and its bandwidth are critical. The light source's spatial location and direction are also important. For example, the relative difference in light source location is a probable cause of the image quality differences between the left and right eyes in Fig. 8. Axial alignment with the camera produces specular reflections from the cornea, which may degrade performance. On the other hand, the size of the specular reflection, especially if located on the pupil, could be used for focus adjustment and even PSF calculations.

Future work will involve implementing a robust segmentation algorithm that performs better on images acquired from different distances, using iris images at different scales, resolutions, and locations, and at various degradation levels. Implementation of fast voting criteria for the selection of the best frame before and after segmentation is critical for the performance of the system, and thus further development of near real-time quality assessment is needed. Implementation of an integrated camera and illumination-source feedback control in an embedded computational unit will make the system more robust and easier to use. Future work also needs to address illumination techniques (active, passive, and structured) for noncooperative standoff biometric systems.
VII. ACKNOWLEDGEMENTS

This work was supported in part by a grant from the Center for Identification Technologies Research (CITeR). Research was performed in cooperation with Dr. Besma Abidi, University of Tennessee, Knoxville, TN. The authors thank her for many useful discussions and the quality metrics evaluation of the acquired images.

REFERENCES
[1] A. K. Jain and A. Kumar, Second Generation Biometrics, Springer, ch. "Biometrics of Next Generation: An Overview", 2010.
[2] D. Bhattacharyya, R. Ranjan, P. Das, T. Kim, and S. K. Bandyopadhyay, "Biometric Authentication Techniques and its Future Possibilities", Second International Conference on Computer and Electrical Engineering (ICCEE '09), vol. 2, pp. 652-655, 2009.
[3] Q. Xiao, "Technology review - Biometrics-Technology, Application, Challenge, and Computational Intelligence Solutions", IEEE Computational Intelligence Magazine, vol. 2, pp. 5-25, 2007.
[4] K. Bowyer, K. Hollingsworth, and P. Flynn, "Image understanding for iris biometrics: A survey", Computer Vision and Image Understanding, vol. 110, no. 2, pp. 281-307, 2008.
[5] A. K. Jain, A. Ross, and S. Prabhakar, "An introduction to biometric recognition", IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 4-20, 2004.
[6] K. Bowyer, K. Hollingsworth, and P. Flynn, "Image understanding for iris biometrics: A survey", Computer Vision and Image Understanding, vol. 110, no. 2, pp. 281-307, 2008.
[7] J. G. Daugman, "New methods in iris recognition", IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, vol. 37, no. 5, pp. 1167-1175, 2007.
[8] R. Derakhshani and A. Ross, "A Texture-Based Neural Network Classifier for Biometric Identification using Ocular Surface Vasculature", International Joint Conference on Neural Networks, pp. 2982-2987, 2007.
[9] L. Nadel and T. Cushing, "Eval-Ware: Biometrics Resources [Best of the Web]", IEEE Signal Processing Magazine, vol. 24, no. 6, pp. 136-139, 2007.
[10] C. Fancourt, L. Bogoni, K. Hanna, Y. Guo, R. Wildes, N. Takahashi, and U. Jain, "Iris recognition at a distance", Proceedings of the 2005 IAPR Conference on Audio and Video Based Biometric Person Authentication, pp. 1-13, USA, July 2005.
[11] K. Ricanek, M. Savvides, D. L. Woodard, and G. Dozier, "Unconstrained Biometric Identification: Emerging Technologies", Computer, vol. 43, no. 2, pp. 56-62, 2010.
[12] A. Ross, "Recent Progress in Ocular and Face Biometrics: A CITeR Perspective", 2010, http://www.csee.wvu.edu/~ross.
[13] H. Proenca and L. A. Alexandre, "Iris Segmentation Methodology for Non-cooperative Recognition", IEE Proceedings - Vision, Image and Signal Processing, vol. 153, pp. 199-205, 2006.
[14] J. Ortega-Garcia et al., "The Multi-Scenario Multi-Environment BioSecure Multimodal Database (BMDB)", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 6, pp. 1097-1111, 2009.
[15] J. Choi, G. H. Soehnel, B. E. Bagwell, K. R. Dixon, and D. V. Wick, "Optical requirements with turbulence correction for long-range biometrics", in Optics and Photonics in Global Homeland Security V and Biometric Technology for Human Identification VI, Proceedings of the SPIE, vol. 7306, pp. 730622-1-730622-11, 2009.
[16] SBIR Program A10-100, "Standoff-Biometric for Non-Cooperative Moving Subjects", http://www.dodsbir.net/sitis/archives_display_topic.asp.
[17] T. Boult and W. Scheirer, “Long-Range Facial Image Acquisition and
Quality” in Handbook of Remote Biometrics for Surveillance and
Security, Edited by Massimo Tistarelli, Stan Z. Li and Rama Chellappa,
Springer, pp. 169-192, 2009.
[18] H. Proenca, "Non-cooperative iris recognition: Issues and trends", 19th European Signal Processing Conference (EUSIPCO 2011), Barcelona, Spain, August 29 - September 2, 2011.
[19] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: A Database of Visible Wavelength Iris Images Captured On-The-Move and At-A-Distance", IEEE Transactions on Pattern Analysis and Machine Intelligence, in press.
[20] H. Proenca, "On the feasibility of the visible wavelength, at-a-distance and on-the-move iris recognition", IEEE Workshop on Computational Intelligence in Biometrics: Theory, Algorithms, and Applications (CIB 2009), pp. 9-15, March 2009.
[21] F. W. Wheeler, A. G. Amitha Perera, G. Abramovich, B. Yu, and P. H.
Tu, “Stand-off Iris Recognition System”. IEEE 2nd International
Conference on Biometrics: Theory, Applications, and Systems (BTAS
08), Sept. 2008.
[22] W. Dong, Z. Sun, and T. Tan, "A Design of Iris Recognition System at a Distance", Chinese Conference on Pattern Recognition (CCPR 2009), pp. 1-5, Nov. 2009.
[23] J. R. Matey, "Iris Recognition: On the Move, At a Distance, and Related Technologies", Sarnoff Corporation, Princeton, NJ, 08543.
[24] Y. Du, N. L. Thomas, and E. Arslanturk, "Multi-level iris video image
thresholding," in IEEE Workshop on Computational Intelligence in
Biometrics:Theory, Algorithms, and Applications, 2009, pp. 38-45.
[25] "Comprehensive Evaluation of Stand-Off Biometrics Techniques for Enhanced Surveillance during Major Events", http://pubs.drdc.gc.ca/inbasket/mmgreene.110426_0911.DRDC_CSS_CR-2011-08.pdf.
[26] F. Bashir, P. Casaverde, D. Usher, and M. Friedman, "Eagle-eye: a system for iris recognition at a distance", 2008 IEEE Conference on Technologies for Homeland Security, 12-13 May 2008, pp. 426-431.
[27] V. N. Boddeti and B.V.K. Kumar, “Extended-Depth-of-Field Iris
Recognition Using Unrestored Wavefront-Coded Imagery”. IEEE
Transactions on Systems, Man, and Cybernetics - Part A: Systems and
Humans 40 (3), May 2010, 495-508.
[28] S. McCloskey, A.W. Au, and J. Jelinek, “Iris capture from moving
subjects using a fluttering shutter”. Fourth IEEE International
Conference on Biometrics: Theory Applications and Systems (BTAS
10), Sept. 2010.
[29] A. Ross, R. Pasula, and L. Hornak, “Exploring multispectral iris
recognition beyond 900nm”. IEEE 3rd International Conference on
Biometrics: Theory, Applications, and Systems (BTAS 09), Sept. 2009.
[30] K. Grabowski, W. Sankowski, M. Zubert, and M. Napieralska, “Iris
structure acquisition method”. 16th International Conference Mixed
Design of Integrated Circuits and Systems (MIXDES ’09), June 2009,
640-643.
[31] X. He, J. Yan, G. Chen, and P. Shi, “Contactless Autofeedback Iris
Capture Design”. IEEE Transactions on Instrumentation and
Measurement 57 (7), 2008, 1369-1375.
[32] J. R. Matey and L. R. Kennell, "Iris Recognition - Beyond One Meter", Handbook of Remote Biometrics, 2009.
[33] J. R. Matey, D. Ackerman, J. Bergen, and M. Tinker, ”Iris recognition in
less constrained environments”, Springer Advances in Biometrics:
Sensors, Algorithms and Systems, pp. 107–131, October, 2007.
[34] H. Bharadwaj, H.S. Bhatt, M. Vatsa, and R. Singh, “Periocular
biometrics: When iris recognition fails”. Fourth IEEE International
Conference on Biometrics: Theory Applications and Systems (BTAS
10), Sept. 2010.
[35] R. Derakhshani, P. Doynov, and B. Abidi, “An Acquisition Platform for
Non-cooperative, Long Range Ocular Biometrics”, Project report,
CITeR 2008.