Technology has evolved in complex and dynamic ways, especially in the manner in which we communicate with one another. At the heart of this is a rich and growing field called biometrics, which is redefining the way we identify and recognise each other. When we talk about biometrics in the modern context, we either refer to technology we use in our day-to-day lives, such as the fingerprint sensor (and now, the iris sensor) on our phones, or to a futuristic scenario involving face recognition, as seen in the Tom Cruise sci-fi flick, Minority Report. But what exactly does it mean? At its most basic level, biometrics is exactly what the word suggests when broken down – ‘bio’ means life and ‘metrics’ means measurement. Any biological or physical marker that can be used to establish a person’s identity qualifies as a biometric. In fact, the practice of biometrics can be traced far back in history.
How the French did it
In the late 1800s, Alphonse Bertillon, a French police officer, was said to have been frustrated by the manner in which data on captured criminals was recorded by the police. He came up with a method called anthropométrie – later known as Bertillonage – a set of body measurements that reportedly had a failure rate of only one in 286,435,456. Law enforcement agencies around the world were quick to adopt the system, until its flaws became apparent.
Because of the inaccuracy of the instruments at the time, measurements of the same person tended to differ depending on who took them. Moreover, there were instances wherein two or more people could have similar or almost identical measurements. In response, Bertillon went on to use and pioneer a more effective parallel biometric system – fingerprint identification. In fact, Bertillon is also credited with exploring other biometric identification methods of the time, such as iris recognition, though only in theory.
|Alphonse Bertillon has been referenced a couple of times in the Sherlock Holmes stories. In The Hound of the Baskervilles, a client refers to Holmes as the “second highest expert in Europe” after Bertillon. In another story, The Naval Treaty, Holmes himself expresses his admiration for the Bertillon system of anthropometrics|
It is rather difficult to pinpoint exactly when fingerprint identification became the standard in the world of biometrics. A reason for this is said to be the difference between the timelines of the fingerprint’s use as a unique anthropometric marker and its use as a means of identification, especially in law enforcement. The 1800s saw several instances of fingerprints being used as a means of authentication in law enforcement and governance around the world.
In 1880, Dr Henry Faulds published a paper in Nature discussing the use of fingerprints as a means of identifying criminals. It was during his time in Tokyo, where he is said to have noticed fingerprints on pottery, that he was prompted to think that something similar could be used for authentication and identification. In the late 1870s, William Herschel, in colonial India, established the use of fingerprints (handprints, in this case) on government documents to prevent forgery. In 1890, Sir Francis Galton, who was well-versed in Faulds’ research, also made significant progress in the field by establishing the ridge as a unique characteristic of the fingerprint.
In 1896, Sir Edward Henry, Inspector General of the Bengal Police, who was said to be looking for a replacement for failing anthropometry as a method of identifying criminals, consulted Sir Francis Galton. After fingerprinting was implemented, one of Henry’s workers, Azizul Haque, created a process to classify and store the prints for easy and accurate searching. The Henry Classification System, as it came to be known, was gradually adopted around the world and is said to be the precursor to modern fingerprinting systems.
About face
The credit as the father(s) of modern face recognition goes to Woody Bledsoe, Helen Chan Wolf, and Charles Bisson. Their semi-manual face recognition system was capable, with moderate accuracy, of recognising a matching photo when provided with an input. However, a downside of this approach was that the facial markers had to be defined by hand. The next breakthrough came in the late 1980s with Eigenfaces, a method that applied linear algebra to face recognition by converting images into compact, low-dimensional representations. This eventually led to the first automated facial recognition systems.
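The core of the Eigenfaces idea can be shown in a few lines: principal component analysis finds a small set of characteristic “eigenfaces”, and each image is represented by its coefficients along them. The sketch below uses random arrays as stand-ins for aligned grayscale face images – the sizes, the number of components kept, and the nearest-neighbour matching rule are all illustrative assumptions, not a real system.

```python
import numpy as np

rng = np.random.default_rng(0)
faces = rng.random((20, 64 * 64))      # 20 "training images", each 64x64, flattened

# Centre the data; the principal components of the centred set are the eigenfaces.
mean_face = faces.mean(axis=0)
centred = faces - mean_face
_, _, vt = np.linalg.svd(centred, full_matrices=False)
eigenfaces = vt[:10]                   # keep the top 10 components

def project(image):
    """Represent an image by its low-dimensional eigenface coefficients."""
    return eigenfaces @ (image - mean_face)

# Recognition: match a probe to the nearest gallery face in eigenface space.
gallery = np.array([project(f) for f in faces])
probe = faces[7] + rng.normal(0, 0.01, faces.shape[1])  # noisy re-capture of face 7
distances = np.linalg.norm(gallery - project(probe), axis=1)
print(distances.argmin())              # → 7
```

The point of the projection step is exactly what made Eigenfaces practical: comparisons happen between 10-dimensional vectors rather than 4,096-pixel images.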
Most of the following decade in facial recognition was spent refining existing systems with larger data sets and better-resolution cameras. Still, those systems weren’t good enough to recognise faces in crowded scenes. That changed, along with other improvements, through research at some of the world’s leading universities in the US and elsewhere. Also, thanks to the adoption of newer features such as 3D face scanning, error rates have reportedly dropped by half every two years. Today, face recognition, along with 3D face scanning, is easily accessible through consumer gadgets, such as the iPhone X with its Face ID feature and the Windows Hello feature on many Windows-based devices.
An aye for an eye
While the development of iris recognition technology in the field of biometrics is fairly recent, the idea that the iris could be used for identification dates as far back as the 1930s. The idea is said to have been first proposed by Frank Burch, an eye specialist, in 1939. But it was John Daugman’s 1994 patent that still forms the basis of most modern-day iris recognition systems. The patent also covered the computer vision algorithms that handle image processing, feature extraction, and matching.
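The matching step in Daugman-style systems is strikingly simple: two binary iris codes are compared with a masked, normalised Hamming distance. The sketch below uses synthetic 2,048-bit codes (real codes come from Gabor-phase encoding of the segmented iris texture); the 5% noise level, 90% mask coverage and 0.32 threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
BITS = 2048

enrolled = rng.integers(0, 2, BITS, dtype=np.uint8)
# A genuine re-capture: the same code with ~5% of bits flipped by noise.
flips = (rng.random(BITS) < 0.05).astype(np.uint8)
probe_same = enrolled ^ flips
probe_other = rng.integers(0, 2, BITS, dtype=np.uint8)   # a different eye

# The mask marks bits occluded by eyelids or reflections; only unmasked bits count.
mask = rng.random(BITS) < 0.9

def hamming_distance(a, b, mask):
    """Fraction of unmasked bits on which the two codes disagree."""
    return (a ^ b)[mask].sum() / mask.sum()

# Comparisons of different eyes cluster around 0.5; genuine matches fall
# well below it, so a threshold such as 0.32 separates the two cases.
THRESHOLD = 0.32
print(hamming_distance(enrolled, probe_same, mask) < THRESHOLD)    # True
print(hamming_distance(enrolled, probe_other, mask) < THRESHOLD)   # False
```

Because unrelated codes disagree on roughly half their bits, the decision is statistical: the further below 0.5 the distance falls, the more confident the match.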
While iris recognition is said to be making its way into modern consumer devices, one of the most recent implementations, on the Samsung Galaxy S8 and S8+, was reportedly beaten by something as elementary as a sufficiently high-resolution photograph of the user’s eye. The Government of India, meanwhile, is said to be building one of the largest databases of biometric data, including iris scans, in the form of its Aadhaar citizen identity programme.
Creatures of habit
While most authentication mechanisms, many of which we’ve mentioned in this article, measure a part of the human body, there is a new and emerging field that involves measuring human behaviour. Behavioural biometrics, or behaviometrics, involves analysing human behaviour as an authentication mechanism. Sounds confusing? You are probably already familiar with voice-recognition-based authentication systems, which form the most prominent example of behaviometrics today. Each of us spends a significant amount of time every day with our gadgets, and as it happens, we use them in a particularly distinctive way that can be analysed and recognised. Using machine learning to perform pattern recognition, devices can now lock out fraudulent users when they detect deviant behaviour. For a workstation, a combination of keystroke dynamics, mouse usage, GUI interactions and behavioural patterns can form something akin to a virtual fingerprint.
In a heartbeat
In the future, the field of biometrics may get even more invasive than it is today. Researchers at the University at Buffalo are said to have conceptualised a ‘cardiac’ scanner able to recognise a person by the shape, size and beating of their heart from a distance. Current technology is also expected to become more precise and convenient: expect fingerprint scanners that one simply waves a hand over to become quite commonplace. Seeing such drastic strides in technology, and especially in biometrics, an important question now arises – would it be okay for huge corporations and government agencies to record and store information about you as personal as even the beating of your heart?