Keynote Speakers



ICB-2016
The 9th IAPR International Conference on Biometrics
June 13-16, 2016. Halmstad, Sweden

Technical sponsors
IAPR and the IEEE Biometrics Council
Hosted by
Halmstad University







Click on the name or title for additional details


David Burnett: Internet-Scale Adoption of Biometrics: Analysis and Next Steps

Vice President for Global Ecosystem Development, Fingerprint Cards AB
@ Tuesday, June 14, 2016 (9:00 am - 10:00 am)
PDF of the presentation: here
Video: here


James Loudermilk: The FBI Fingerprint Program

Senior Level Technologist, FBI Science and Technology Branch
@ Wednesday, June 15, 2016 (9:00 am - 10:00 am)
PDF of the presentation: here
Video: here


Didier Meuwly: Forensic Biometrics: Quantifying Forensic Evidence from Biometric Traces

Principal Scientist, Netherlands Forensic Institute and Chair of Forensic Biometrics, University of Twente
@ Thursday, June 16, 2016 (9:00 am - 10:00 am)
PDF of the presentation: here
Video: here


John Daugman: Biometric Entropy: Searching for Doppelgängers and the rare Entropod Uniquorns

Professor of Computer Vision and Pattern Recognition, University of Cambridge
2016 IAPR Senior Biometrics Investigator Award (SBIA) Talk
@ Thursday, June 16, 2016 (10:30 am - 11:30 am)
PDF of the presentation: here
Video: here






David Burnett

Vice President for Global Ecosystem Development

Fingerprint Cards AB


Title: Internet-Scale Adoption of Biometrics: Analysis and Next Steps

PDF of the presentation: here

Video: here


Abstract

While the last two years have seen significant adoption of biometrics in mobile devices, much work remains to realize their potential and to firmly cement their convenience and security benefits into everyday consumer use. In this talk, Mr. Burnett will explain the adoption pattern for biometric solutions at internet-scale, outline the missing pieces of infrastructure needed to make biometric authentication truly pervasive, and provide a multi-year forward look into biometric adoption trends across a wide range of device types and ecosystems, along with major milestones and industry turning points.


Biography

David Burnett is Fingerprint Cards’ Vice President for Global Ecosystem Development and focuses on accelerating the adoption of biometric authentication for new markets, platforms and use cases through his work with customers, partners and standards bodies. He has a unique perspective from being on the front lines of both evangelism and implementation of internet-scale mobile biometric authentication frameworks for cloud-based services. Previously, while at Nok Nok Labs, Mr. Burnett played pivotal roles in founding and driving expansion of the FIDO Alliance, and in enabling the first FIDO-enabled Android smartphone (Samsung Galaxy S5), the first FIDO-enabled payment service (PayPal), and the first FIDO-enabled carrier (NTT DOCOMO). Underpinning his expertise in biometric authentication on mobile devices is a deep foundation in security and cryptography best practices developed during his leadership positions at both PGP Corporation and Symantec from 2002 through 2011. Mr. Burnett’s strong sense of how to productize and ship innovative technology stretches back more than 20 years, beginning with an independent software engineering consultancy he founded in 1992 and led for 10 years. Mr. Burnett is an alumnus of the Stanford Graduate School of Business and lives in the San Francisco Bay Area with his wife and two daughters.






James Loudermilk

Senior Level Technologist

FBI Science and Technology Branch


Title: The FBI Fingerprint Program

PDF of the presentation: here

Video: here


Abstract

Since 1924, the FBI has been the United States national repository for fingerprints and related criminal history data. At that time, 810,188 fingerprint records from the National Bureau of Criminal Identification and Leavenworth penitentiary were consolidated to form the nucleus of the FBI's files. Over the years, the size of our fingerprint files has grown and the demand for the program's services has steadily increased. Today, the FBI's master criminal fingerprint file contains the records of about 71.2 million individuals, while our civil file represents about an additional 39.5 million individuals. The civil file predominantly contains fingerprints of individuals who have served or are serving in the U.S. military or have been or are employed by the federal government.

For its first 75 years of existence, the processing of incoming fingerprint cards by the FBI was predominantly a manual, time-consuming, labor-intensive process. Fingerprint cards were mailed to the FBI for processing and a paper-based response was mailed back. It would take anywhere from weeks to months to process a fingerprint card. At the peak, over 2,000 persons were engaged full-time in the fingerprint program.

Early on it was clear that technology was needed to improve fingerprint processing. Over the decades the FBI has conducted four successive technology insertion programs. The first introduced electromechanical equipment. The next three have applied pattern matching algorithms and advances in computer processor and storage technology.

Today, an average of more than 220 thousand tenprints are processed daily, with a 99.6% identification rate and a 0.103% false match rate. Average response times are managed by trading off labor cost so as to outperform committed service rates. During FY2015, 4.7 million criminal answer-required transactions were processed with an 8-minute average response time; 18.8 million civil transactions with a 91-minute average response time; and 744 thousand rapid fingerprint searches with a 9-second response time.

The primary focus of the address will be on the four rounds of automation: the driving needs and the results achieved.


Biography

James Loudermilk is the Senior Level Technologist for the FBI Science and Technology Branch. He focuses on identification issues, especially biometrics, and frequently represents the FBI on these topics. He recently served as the Department of Justice co-chair of the Biometrics and Identity Management Subcommittee of the National Science and Technology Council. He continues to co-chair the interagency working group that has replaced the NSTC subcommittee. He is a member of the FBI Biometric Steering Committee and Institutional Review Board. He previously was chief engineer and deputy program manager for the $640 million Integrated Automated Fingerprint Identification System (IAFIS), which includes the national criminal history and fingerprint check systems. He also led the development of the National Instant Criminal Background Check System and call center, which supports purchase eligibility checks for firearms and explosives. He has also served as FBI Chief IT Architect, Chief IT Strategist, Deputy Chief Technology Officer, and Assistant Director of the IT Operations Division. Before entering the civil service, Mr. Loudermilk was in the private sector for over twenty years, holding executive positions in system engineering, software development, program management, and logistics, and serving as a divisional CIO. He holds Bachelor's and Master's degrees in Mathematics from the University of Dayton and the Degree of Applied Scientist in Communications Engineering from the George Washington University. He is a graduate of the U.S. Army Command and General Staff College.






Didier Meuwly

Principal Scientist, Netherlands Forensic Institute

Chair of Forensic Biometrics, University of Twente


Title: Forensic Biometrics: Quantifying Forensic Evidence from Biometric Traces

PDF of the presentation: here

Video: here


Abstract

Outline

The talk will begin with a short introduction of the Netherlands Forensic Institute (NFI), its tasks, its organisation, its requesters and the role of forensic biometrics within the Institute. It will then concentrate on the definition of forensic biometrics, describe the informative value of the different biometric modalities in a forensic context, and cover the different forensic applications of biometric technology using operational examples. The validation of forensic evaluation methods used to assess the strength of evidence will then be presented in detail. Finally, the talk will conclude with a short overview of some current topics of research in forensic biometrics within the NFI.

Definitions

Forensic science has an object of study: the crime and its traces. Traces are the most elementary pieces of information that result from crime. They are silent witnesses that need to be detected, analysed, and understood to make reasonable inferences about criminal phenomena (forensic intelligence), as well as for forensic investigation and court purposes.

Biometric recognition is the human-based and computerised recognition of individuals, based on their physical/biological/chemical and behavioural characteristics. Computerised approaches consist of feature extraction and feature comparison algorithms, as well as methods for the inference of identity of source. Biometric recognition makes it possible to differentiate between human beings and to recognise them to a certain degree, depending on the modality, the application and the quality of the data (trace and reference specimens).

Forensic biometrics is defined as the application of human-based and computer-assisted biometric recognition methods and technologies to analyse biometric traces and reference specimens, in order to answer questions about the origin of these traces (source level inference). The examination of biometric traces can also answer other forensically relevant questions, about the activity that led to a trace (activity level inference) and whether this activity is constitutive of a criminal offence (offence level inference).

Modalities

In a forensic context, traces like biological traces, fingermarks, earmarks or bitemarks are physically collected at crime scenes. Others, like face and body images, voice recordings, gait recordings, or fingerprints and irises scanned for authentication, are digitised with capture devices. Some traces only exist digitally, like keystroke and touchscreen dynamics. Some modalities provide traces within physical crimes, others within cybercrime or post-mortem individualisation. Finally, some modalities, like retinal vein patterns, do not provide any forensic trace.

Applications

A short movie describing a robbery is presented as an example to introduce three forensic applications in which biometric recognition plays a role: forensic intelligence, forensic investigation and forensic evaluation. Forensic intelligence consists of linking criminal cases together; it is introduced using the fingermark and DNA modalities as examples.

Forensic investigation consists of selecting shortlists of candidates who are potential donors of traces in criminal cases; it is described in detail using the fingermark and face modalities as examples.

Forensic evaluation focuses on the description of the strength of evidence that an individual is the donor of a trace in a criminal case. The statistical methodology used to describe this strength of evidence and to assign likelihood ratios is explained using the body-height modality (a human-based approach) and the speaker-recognition modality (a computer-based approach).
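
In the likelihood-ratio framework referred to here, the strength of the evidence E is reported as the ratio of its probability under the prosecution hypothesis H_p (the person of interest is the donor of the trace) to its probability under the defence hypothesis H_d (someone else is the donor), i.e.

LR = \Pr(E \mid H_p) \,/\, \Pr(E \mid H_d)

A likelihood ratio above 1 supports H_p, a value below 1 supports H_d, and the further the value lies from 1, the stronger the support.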

Biometric recognition can also be used to reach a decision of identity verification of suspects or a decision of identification (closed-set or open-set) of victims in the context of disaster victim identification (DVI).

Validation

In the last decade, forensic biometric research has developed computer-assisted methods for the analysis, comparison and evaluation of evidence, to support forensic practitioners in their quest for more objective methods to report likelihood ratios. According to the EU Council framework decision 2009/905/JHA, forensic service providers carrying out laboratory activities such as the forensic evaluation of DNA and fingermarks/prints have been required to be accredited since 2015, for example under ISO/IEC 17025:2005, General requirements for the competence of testing and calibration laboratories. As a consequence of this EU decision, the human-based and computer-assisted methods used for the forensic evaluation of fingermarks/prints need to be validated. Within forensic science, guidelines exist for the validation of the human-based methods used for forensic evaluation. They mainly focus on the education and the competence assessment of the practitioners and are therefore not suitable for the validation of computer-assisted methods.

Methods for the validation of computer-assisted methods have been developed more recently. They are being published, and there is an incentive to integrate them into the ISO/IEC 19795:2012 standard (Information technology, Biometric performance testing and reporting, Part 6: Testing methodologies for operational evaluation). The validation strategy uses primary and secondary performance characteristics, identified as relevant for describing the performance and the limits of likelihood-ratio methods, and related performance metrics are used to measure them. The delicate question of setting validation criteria in a completely new context, in which no baseline exists, will also be discussed.
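
As an illustration of the kind of performance metric used in validating likelihood-ratio methods, the sketch below computes the log-likelihood-ratio cost (Cllr), a commonly reported metric for sets of LRs obtained from same-source and different-source comparisons. The function name and the example values are illustrative only and are not taken from the talk.

import math

def cllr(same_source_lrs, different_source_lrs):
    """Log-likelihood-ratio cost: lower is better; 1.0 corresponds to an
    uninformative method that always reports LR = 1."""
    # Penalty for low LRs on same-source pairs
    pen_ss = sum(math.log2(1.0 + 1.0 / lr) for lr in same_source_lrs)
    # Penalty for high LRs on different-source pairs
    pen_ds = sum(math.log2(1.0 + lr) for lr in different_source_lrs)
    return 0.5 * (pen_ss / len(same_source_lrs) + pen_ds / len(different_source_lrs))

# Hypothetical validation set of likelihood ratios
print(cllr([20.0, 8.0, 150.0], [0.05, 0.30, 0.01]))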


Biography

Didier Meuwly was born in 1968 in Fribourg, Switzerland. After a classical education (Latin/Philosophy), he trained as a criminalist and criminologist (1993) and obtained his PhD (2000) at the School of Forensic Science (IPS) of the University of Lausanne. Currently he shares his time between the Forensic Institute of the Ministry of Security and Justice of the Netherlands (Netherlands Forensic Institute), where he is a principal scientist, and the University of Twente, where he holds the chair of Forensic Biometrics. He specialises in the automation and validation of the probabilistic evaluation of forensic evidence, and more particularly of biometric traces. He was previously the leader of a project on the probabilistic evaluation of fingermark evidence, and responsible for the fingerprint section within the NFI. From 2002 to 2004, he worked as a senior forensic scientist within the R&D department of the Forensic Science Service (UK-FSS), at the time an executive agency of the British Home Office. From 1999 to 2002 he was responsible for the biometric research group of the IPS. He is a founding member of two working groups of the European Network of Forensic Science Institutes (ENFSI): the Forensic Speech and Audio Analysis Working Group (FSAAWG) in 1997 and the European Fingerprint Working Group (EFPWG) in 2000. He is still active within the EFPWG. He is also a member of the editorial board and a guest editor of Forensic Science International (FSI).






John Daugman

Professor of Computer Vision and Pattern Recognition, University of Cambridge

2016 IAPR Senior Biometrics Investigator Award (SBIA) Talk


Title: Biometric Entropy: Searching for Doppelgängers and the rare Entropod Uniquorns

PDF of the presentation: here

Video: here


Abstract

Entropy is the origin of collision avoidance (resistance to False Matches) in biometric systems. It quantifies the random variation of features across a population, reflects their dependencies and predictabilities, and determines the population sizes in which biometric identification can work. It is therefore surprising that this core concept from Information Theory has so far played little role in biometric research.
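
For reference, the entropy in question is the usual Shannon quantity; for a discrete feature X with distribution p it is

H(X) = -\sum_{x} p(x) \log_2 p(x)

and the entropy of a whole feature vector falls short of the sum of the per-feature entropies whenever the features are correlated, which is why dependencies and predictabilities across the population matter here.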

This talk explores biometric entropy within the face and iris modalities, meeting along the way Doppelgängers and the rare, newly discovered creatures, Entropod Uniquorns. A method for estimating biometric entropy is discussed. The intrinsic "channel capacity" of the IrisCode (taking into account its internal correlations) is measured at 0.566 bits of entropy per bit encoded, of which 0.469 bits of entropy per bit is encoded from natural iris images. The difference between these two rates reflects the existence of anatomical correlations within a natural iris, absent in "white noise" iris images.

The relative narrowness of the IrisCode impostor distribution reflects this high entropy, and it makes all different eyes roughly equidistant from each other. This desirable property is related to the epigenetic nature of iris patterns. The universality of the IrisCode impostor distribution was examined by generating 316,250 entire distributions of impostor scores, each distribution obtained by comparing one iris against hundreds of thousands of others in a database of persons spanning 152 nationalities. These comparisons (totalling 100 billion) showed that the IrisCode impostor distribution is remarkably universal, with only rare and small variations in mean or standard deviation. The talk concludes with implications of these issues for search strategies, including "1-to-many" and "1-to-first", as well as some new applications.
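
The impostor scores mentioned above are fractional Hamming distances between pairs of IrisCodes, computed only over bits that both masks flag as valid (not occluded by eyelids, eyelashes or reflections). A minimal sketch of that comparison is given below; it omits the rotation compensation (taking the minimum distance over several cyclic shifts) used in practice, and the random codes are stand-ins for real Gabor-phase IrisCodes.

import numpy as np

def fractional_hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of jointly valid bits on which two iris codes disagree."""
    valid = mask_a & mask_b                  # bits usable in both codes
    disagreeing = (code_a ^ code_b) & valid  # usable bits that differ
    return disagreeing.sum() / valid.sum()

# Toy example: two unrelated 2048-bit codes give a distance near 0.5
rng = np.random.default_rng(0)
a = rng.integers(0, 2, 2048, dtype=np.uint8)
b = rng.integers(0, 2, 2048, dtype=np.uint8)
m = np.ones(2048, dtype=np.uint8)
print(fractional_hamming_distance(a, b, m, m))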


Biography

John Daugman received his degrees at Harvard University and then taught at Harvard before coming to Cambridge University, where he is Professor of Computer Vision and Pattern Recognition. He has held the Johann Bernoulli Chair of Mathematics and Informatics at the University of Groningen, and the Toshiba Endowed Chair at the Tokyo Institute of Technology. His areas of research and teaching at Cambridge include computer vision, information theory, neurocomputing and statistical pattern recognition. Awards for his work in science and technology include the Information Technology Award and Medal of the British Computer Society, the "Time 100" Innovators Award, and the OBE, Order of the British Empire. He has been elected to Fellowships of the Royal Academy of Engineering, the US National Academy of Inventors, the Institute of Mathematics and its Applications, the International Association for Pattern Recognition, and the British Computer Society. He was one of three finalists for the European Inventor of the Year Award, and he has been inducted into the US National Inventors Hall of Fame. He is the founder and benefactor of the Cambridge Chrysalis Trust.



Center for Applied Intelligent Systems Research (IS-Lab/CAISR), School of Information Technology, Halmstad University, Sweden