Private biometrics
A form of biometrics, also called Biometric Encryption or BioCryptics, in which the prover is protected against the misuse of template data by a dishonest verifier.
Biometric identification requires the verifier to search for matches in a database that contains data about the entire population. This introduces a security and privacy threat: a verifier who steals biometric templates of some (or even all) persons in the database can perform impersonation attacks. When a private verification system is used on a large scale, the reference database has to be made available to many different verifiers, who, in general, cannot be trusted. Information stolen from a database can be misused to construct artificial biometrics that impersonate people; such construction is possible even if only part of the template is available.
To develop insight into the security aspects of biometrics, one can distinguish between verification and private verification. In a typical verification setting, access to the reference template allows a malicious verifier to artificially construct measurement data that will pass the verification test, even if the prover has never undergone a biometric measurement after enrollment.
In private verification, the reference data must not leak information that would allow the verifier to (effectively) construct valid measurement data. Such protection is common practice for the storage of computer passwords. When a computer verifies a password, it does not compare the password typed by the user with a stored reference copy. Instead, the typed password y is processed by a cryptographic one-way function F, and the outcome is compared against a locally stored reference string F(y). Thus y is only temporarily present in system memory, and no stored data allows calculation of y. This prevents attacks from the inside that steal unencrypted or decryptable secrets.
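The password scheme described above can be sketched as follows, a minimal illustration in which SHA-256 stands in for the one-way function F; a production system would additionally salt and stretch the password:

```python
import hashlib
import hmac

def enroll(password: str) -> bytes:
    """Store only the reference string F(y), never the password y itself."""
    return hashlib.sha256(password.encode()).digest()

def verify(candidate: str, reference: bytes) -> bool:
    """Recompute F on the typed password and compare with the stored F(y)."""
    digest = hashlib.sha256(candidate.encode()).digest()
    return hmac.compare_digest(digest, reference)  # constant-time comparison

stored = enroll("correct horse")        # hypothetical enrolled password
assert verify("correct horse", stored)  # exact match passes
assert not verify("correct h0rse", stored)  # any deviation fails
```

Because only F(y) is stored, an attacker who reads the reference data cannot recover y; the same goal motivates private verification of biometric templates.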
Comparison with handling computer passwords
The main difference between password checking and biometric private verification is that noise and other aberrations are unavoidable in biometric measurements. Noisy measurement data are quantized into discrete values before they can be processed by any cryptographic function. Due to external noise, the outcome of the quantization may differ from measurement to measurement; in particular, if one of the biometric parameters has a value close to a quantization threshold, minor amounts of noise can change the outcome. A cryptographic function amplifies minor changes at its input, so the outcome bears no resemblance to the expected outcome. This property, commonly referred to as ‘confusion’ and ‘diffusion’, makes it non-trivial to use biometric data as input to a cryptographic function: the notion of near matches, or of a distance between enrollment and operational measurements, vanishes after encryption or any other cryptographically strong operation. Hence, measured data cannot be compared with reference data in the encrypted domain without prior precautions to contain the effect of noise.
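Both the problem and one standard precaution can be illustrated with a toy sketch (hypothetical numbers and bit strings; SHA-256 stands in for the cryptographic function). The first part shows a near-threshold parameter whose hash changes completely under slight noise; the second part sketches a fuzzy commitment in the style of Juels and Wattenberg, here using a simple repetition code to absorb the noise:

```python
import hashlib

# Part 1: a parameter near the quantization threshold.
def quantize(sample: float, threshold: float = 0.5) -> int:
    return 1 if sample >= threshold else 0

enrolled, measured = 0.49, 0.51            # same person, slight sensor noise
assert quantize(enrolled) != quantize(measured)   # threshold was crossed
h1 = hashlib.sha256(bytes([quantize(enrolled)])).hexdigest()
h2 = hashlib.sha256(bytes([quantize(measured)])).hexdigest()
assert h1 != h2   # diffusion: the digests are entirely unrelated

# Part 2: fuzzy commitment. XOR an error-correcting codeword with the
# biometric bits; store only the XOR (helper data) and a hash of the secret.
R = 3  # repetition factor: corrects up to 1 flipped bit per block

def encode(secret):                        # secret bits -> repetition codeword
    return [b for bit in secret for b in [bit] * R]

def decode(code):                          # majority vote per block
    return [1 if sum(code[i:i+R]) > R // 2 else 0
            for i in range(0, len(code), R)]

def commit(bio_bits, secret):
    helper = [c ^ b for c, b in zip(encode(secret), bio_bits)]
    return helper, hashlib.sha256(bytes(secret)).hexdigest()

def open_commitment(noisy_bits, helper):
    return decode([h ^ b for h, b in zip(helper, noisy_bits)])

secret = [1, 0, 1]                          # key bound to the biometric
bio    = [0, 1, 1, 0, 0, 1, 1, 0, 1]        # enrollment measurement
noisy  = [0, 1, 1, 1, 0, 1, 1, 0, 1]        # one bit flipped by noise
helper, ref_hash = commit(bio, secret)
recovered = open_commitment(noisy, helper)
assert hashlib.sha256(bytes(recovered)).hexdigest() == ref_hash  # accepted
```

The repetition code is only for illustration; practical schemes use stronger error-correcting codes, but the principle is the same: errors within the correction distance are removed before the cryptographic comparison, so near matches survive while the stored helper data reveals neither the secret nor the template.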
Meanwhile, it is important to realize that protecting the reference data stored in a database is not a complete solution to the above-mentioned threats. After having had an opportunity to observe operational biometric data, a dishonest verifier can reuse these measurement data, and this can happen without anyone noticing it: Victor grabs the fingerprint image left behind on a sensor. This corresponds to logging all keystrokes, including the plaintext passwords typed by a user.
References
Jeroen Breebaart, Christoph Busch, Justine Grave, Els Kindt: A Reference Architecture for Biometric Template Protection based on Pseudo Identities. In Arslan Brömme, Christoph Busch, Detlef Hühnlein (Eds.): BIOSIG 2008, 2008, pages 25-37, Lecture Notes in Informatics 137, Gesellschaft für Informatik, http://www.jeroenbreebaart.com/papers/biosig/biosig2008.pdf.
Ileana Buhan, Emile Kelkboom, Koen Simoens: A Survey of the Security and Privacy Measures for Anonymous Biometric Authentication Systems. International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP 2010), 2010, IEEE Computer Society, http://www.cosic.esat.kuleuven.be/publications/article-1462.pdf.
Ann Cavoukian, Alex Stoianov: Biometric Encryption: A Positive-Sum Technology that Achieves Strong Authentication, Security and Privacy. Discussion paper of the Office of the Information and Privacy Commissioner of Ontario, 2007, http://www.ipc.on.ca/images/Resources/bio-encryp.pdf.
Ann Cavoukian, Alex Stoianov: Biometric Encryption: The New Breed of Untraceable Biometrics. In: Nikolaos V. Boulgouris, Konstantinos N. Plataniotis, Evangelia Micheli-Tzanakou (Eds.): Biometrics: Theory, Methods, and Applications, 2009, John Wiley & Sons, Inc., Hoboken, NJ, USA, pages 655-710, ISBN 978-0-470-24782-2.
Ari Juels, Martin Wattenberg: A Fuzzy Commitment Scheme. In: ACM Conference on Computer and Communications Security, 1999, pages 28-36.
Pim Tuyls, Boris Skoric, Tom Kevenaar (Eds.): Security with Noisy Data: Private Biometrics, Secure Key Storage and Anti-Counterfeiting. Springer, 2007, ISBN 978-1-84628-983-5.
Jean-Paul Linnartz, Pim Tuyls: New Shielding Functions to Enhance Privacy and Prevent Misuse of Biometric Templates. In: 4th International Conference on Audio- and Video-Based Biometric Person Authentication, Guildford, United Kingdom, 9-11 June 2003.
Private Identity Matching (white paper), http://www.priv-id.com/images/Technology-primer.pdf.