Biometric Computer Security Technology Developments

Introduction
This report examines current developments in biometric technology.  Biometric technology represents the latest concept in the identification and authentication (I&A) aspect of computer and network security.  Many biometric technologies have been developed and are currently under scrutiny for wide-scale practical deployment.  These include fingerprints, iris scans, retinal scans, facial recognition, voiceprint analysis, biometric implants, and vein authentication systems.  After a brief introduction to the fundamental architecture and design issues, each of these technologies is investigated from the perspective of results already established, ongoing research and development efforts, and problems that remain unresolved.
This paper establishes the current challenges and opportunities in research on biometric systems. Undoubtedly, technological products are becoming part and parcel of the everyday tools of individuals and organizations. In this regard, biometric systems, just like other technological products, will gain more prominence in the near future as individuals seek more secure ways of accessing and storing information. While these developments are beneficial, it is equally important to identify their cultural and social impact.
History of Identification and Authentication
Computers for “public” use first took the form of large mainframes, manned by teams of operators, to which users submitted distinct “jobs” encoded on punch cards.  There was no notion of separating or protecting system data, such as files, by user: the only “security” measures in place were present to control access to limited system resources, such as magnetic core memory or disk sectors.  When timesharing systems became popular in the 1970s, the notion of users maintaining their own private domains of files and programs and “logging in” to the system to access those domains became prevalent.  The technology of choice—indeed, the only available technology—was that of the password.  Passwords were stored—albeit in cleartext—in a central database, and the candidate password entered by the user was verified against this database.  The UNIX system pioneered the concept of storing passwords in encrypted form and comparing the encrypted representation of the user’s candidate password to the canonical encrypted value in order to determine whether to grant or deny login access (Hiscott 2013).
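The following is a minimal sketch, in Python, of this hashed-comparison principle: a salted hash of the password is stored at enrollment, and a candidate password is hashed and compared at login.  It illustrates the concept only; the key-derivation function, salt size, and sample passwords are illustrative assumptions, not the historical UNIX crypt(3) routine.

    import hashlib, hmac, os

    def hash_password(password: str, salt: bytes) -> bytes:
        # Derive a slow, salted hash (PBKDF2-HMAC-SHA256) of the password.
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    salt = os.urandom(16)                          # per-user random salt
    stored_hash = hash_password("hunter2", salt)   # enrollment: only the hash is stored

    def verify(candidate: str) -> bool:
        # Hash the candidate and compare in constant time; cleartext is never stored.
        return hmac.compare_digest(hash_password(candidate, salt), stored_hash)

    print(verify("hunter2"))   # True
    print(verify("letmein"))   # False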
While computer viruses and malware are nearly as old as personal computers themselves—early examples date to the late 1970s and early 1980s—it was not until the mid-1990s that weaknesses in password technology became readily apparent.  This observation coincided with the rise of a public internet, immediately and flexibly accessible through a World Wide Web of millions of servers distributed globally, that presented attractive targets to hackers, organized criminals, and even rogue nation-states.  The capability was present to endeavor to break into users’ private files and so access their bank accounts, credit card numbers, etc.  Early computer security studies determined that the very principle underlying the password—“something you know”—is fundamentally weak.  Persons surreptitiously “shoulder-surfing” when a user types his password can easily steal the secret string.  Experiments in automated assignment of “random, pronounceable passwords” were conducted, but such passwords were so difficult to remember that users would often write them down, enabling others to steal them.  Users can also be induced—whether willingly or under coercion—to share their passwords with other persons, obviating the very raison d’être of password-based I&A (Hiscott 2013).
The notion of replacing or augmenting “something you know”—a password—with “something you have”—such as a “chip card,” or smart card containing a computer chip—was experimented with as early as the late 1980s.  In fact, many European governments established pilot programs whereby citizens were given chip cards that enabled them to access experimental public networks.  Unfortunately, while chip cards represented a clear step forward in technological sophistication, they failed to address the fundamental problem.  Chip cards could still be shared or stolen, enabling unintended, possibly malicious, users to access the system and, moreover, frame legitimate users for their maleficent actions.  While some vendors fielded systems that required a combination of a password and a smart card—demanding “something you know” plus “something you have”—this, too, failed to redress the foundational weakness of the concept (Ardiley 2012).
Biometric Technology
Introducing Biometrics
The next concept that arrived on the scene was that of biometrics, or “something you are,” that is, physical parameters of users’ bodies.  While passwords can be shared and smart cards can be stolen, it is, for the most part, infeasible to appropriate bodily parts.  Obvious examples include fingerprints, footprints, voiceprints, iris scans, retinal scans, and facial recognition.  Assuming that the considerable technical challenges can be overcome—the more sophisticated and foolproof an I&A technology is, the more difficult it is to design and engineer—these technologies harbor great promise.  They are also exceedingly difficult to defeat.  Admittedly, there are extreme circumstances in which they could be defeated: for example, if John’s fingerprint affords him access to a military computer system safeguarding highly classified defense intelligence files, enemy agents could conceivably amputate John’s finger and use it to obtain unlawful access.  However, such examples are so extreme—albeit, unfortunately, chillingly real—that they can be all but dismissed from practical consideration.
User Perceptions
Biometrics offer the promise of convenience.  For example, a user need no longer remember a suite of passwords in order to access various systems, networks, and applications if they can be set up to rely upon a uniform biometric I&A mechanism.  Three problems immediately present themselves, however.  The first is whether engaging with the biometric technology is socially or culturally acceptable.  For example, in the United States, footprints have been taken of every newborn infant for many decades.  Footprints are unique, and infant footprints can be algorithmically extrapolated to their adult equivalents, or adults’ footprints could be taken.  However, many users in the Western world would likely find it culturally unacceptable—as well as annoying—to have to remove their footwear and hose and then plant their bare foot on an electronic sensor panel.  The second problem is whether the user finds interaction with the biometric mechanism to be embarrassing.  For example, some persons have psychological hang-ups that render them extremely reluctant to be photographed or measured in any way, shape, or form.  These persons obviously would be unable to access computer systems that are protected by biometric safeguards.  A final problem is that some persons do not possess the requisite body parts, whether by birth defect or as a result of an accident.  For example, biometric techniques that examine the right index finger cannot be used by people who are missing that finger.  While some of these objections are more readily dismissed than others, the fact is that it is extraordinarily challenging to develop a “one-size-fits-all” technique that is ready for deployment throughout vast communities of computer users (Riley et al. 2009).
Recent experiments have also taught much about how users perceive biometric technology, specifically, whether they find it to constitute an unacceptably invasive artifact of an ever more overpowering “Big Brother.”  Recently, a private company in Wisconsin implemented a system in which employees could access corporate computer systems and networks by means of a tiny electronic component painlessly implanted in the flesh between the thumb and forefinger.  While many users leapt at the opportunity to forego the intellectual burden of remembering a password and the tedium of typing it in, many other users rebelled outright at the notion of a private company—let alone a governmental authority—requiring them to obtain physical implants (Bowerman 2017).  There are also underlying religious objections to such measures that cannot be legislated around, at least, not in a manner consistent with United States law (Riley et al. 2009).  For example, traditional Jewish law prohibits permanent bodily modifications such as tattoos.  By rather obvious extension, observant Jews would be forbidden from having electronic components implanted beneath their skin.
Technical Issues
Beyond the physical, cultural, religious, and philosophical objections to biometric technology, there remain purely technical issues that are unsolved.  Biometric I&A techniques require sophisticated new hardware and intricate new software to be added to computer systems and networks.  A biometric device, such as a fingerprint scanner, incorporates a hardware component known as a controller that is directed by software known as a device driver.  This device driver must be integrated into the operating system in a manner conformant to the most rigorous software engineering principles.  Ideally, the computer’s operating system should provide a “trusted path” between itself and the biometric device driver: otherwise, the biometric technology is altogether moot.  For example, suppose an amazingly sophisticated facial recognition device is able to discern between authorized and unauthorized user communities with one hundred percent accuracy.  Suppose, also, that it sends a one-bit message to the operating system, viz., 1 if the user is authorized and 0 if the user is not authorized.  If a sophisticated attacker can somehow subvert the communication path between the biometric device and the operating system—changing a 0 to a 1—the expensive, highbrow facial recognition module might as well not even be installed.  For a more mundane, readily appreciated example, suppose a uniformed officer requests identification from a prospective visitor to a military base and determines that he is unauthorized.  If the malefactor is able to, e.g., render the officer unconscious; commandeer his walkie-talkie; and transmit the appropriate protocol—say, “Guest identified; ten-four”—to the visitor control center (VCC), the first line of defense will thereby immediately have been circumvented (Anderson & Vaughn 1991, p. 16).
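The following Python sketch illustrates one conventional way of hardening such a channel: rather than emitting a bare, forgeable bit, a hypothetical device signs its verdict with a key shared with the operating system, so that a flipped bit fails verification.  The key-provisioning scheme and message format are assumptions for illustration, not a description of any fielded product.

    import hashlib, hmac, os, time

    SHARED_KEY = os.urandom(32)  # hypothetically provisioned to both device and OS

    def device_send(decision: int) -> tuple[bytes, bytes]:
        # The device signs its one-byte verdict plus a timestamp, rather than
        # emitting an unauthenticated bit.
        msg = bytes([decision]) + int(time.time()).to_bytes(8, "big")
        tag = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()
        return msg, tag

    def os_receive(msg: bytes, tag: bytes) -> bool:
        # The OS rejects any message whose MAC fails; flipping the decision
        # bit in transit invalidates the tag.
        expected = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tag):
            raise ValueError("tampered or forged decision message")
        return msg[0] == 1

    msg, tag = device_send(1)
    print(os_receive(msg, tag))      # True: verdict accepted
    tampered = bytes([0]) + msg[1:]  # attacker flips the verdict bit
    # os_receive(tampered, tag) now raises ValueError instead of being believed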
There is also a considerable administrative burden associated with implementing biometric I&A.  On a personal computer outfitted with, e.g., a fingerprint scanner, it is a simple matter to enroll an authorized individual or, perhaps, his family.  However, in a larger installation, such as a corporate data center, there is an obvious need to enroll hundreds or even thousands of users.  Passwords can be typed by users at their own workstations, whereas fingerprint registration obviously requires that the user physically visit the computer center and submit to an initial fingerprint scan.  This entails time, manpower resources, and careful logistical considerations.
Sample Biometric Technologies
Fingerprint Recognition
One example of a biometric technology that is particularly well fielded is fingerprint scanning and recognition.  Fingerprints are well understood and have been used by police officers for more than a century to identify culprits, although the idea of a national or global database of fingerprints—exemplified by the Federal Bureau of Investigation’s (FBI) Automated Fingerprint Identification System (AFIS)—has only been entertained, let alone realized, for the last twenty years or so.  Most people are unaware that an early popular treatment of fingerprint identification appears in the American writer Mark Twain’s comic 1894 novella, Pudd’nhead Wilson.
Fingerprint scanners rely on a physical component—essentially a camera—that captures an image of the fingerprint and sophisticated software that analyzes the image.  The first problem to be solved is that of alignment.  Matching a newly captured image against a canonical image already stored in a database of authorized fingerprints requires that the images be aligned properly, that is, that the fingertips occupy the same area of the field and are pointed in the same direction.  After this has been achieved by means of a suitable algorithm, the next problem to be solved is to characterize the fingerprint.  Once this has been achieved, there are two means by which the image can be analyzed: geometric pattern recognition and syntactic pattern recognition.
The first of these, geometric pattern recognition, regards the fingerprint image simply as several megabytes’ worth of pixels.  Matrix-based mathematical algorithms, such as template matching, are then applied to try to match this image against a set of candidate images stored on a well-protected security server.  This technique is extremely robust; however, it is often infeasible to transmit megabytes’ worth of data back and forth in order to facilitate such computations: intolerable processing delays could ensue, considerably encumbering users’ timely access to the system.
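As an illustration of the geometric approach, the following sketch computes a normalized cross-correlation score between a stored image and two pre-aligned probes.  Synthetic arrays stand in for real fingerprint images; the image sizes, noise level, and any decision threshold are illustrative assumptions.

    import numpy as np

    def normalized_cross_correlation(a: np.ndarray, b: np.ndarray) -> float:
        # Similarity of two equal-sized, pre-aligned grayscale images, in [-1, 1].
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

    rng = np.random.default_rng(0)
    enrolled   = rng.random((128, 128))                    # stored reference image
    probe_same = enrolled + 0.05 * rng.random((128, 128))  # noisy re-capture, same finger
    probe_diff = rng.random((128, 128))                    # a different finger

    print(normalized_cross_correlation(enrolled, probe_same))  # near 1.0 -> match
    print(normalized_cross_correlation(enrolled, probe_diff))  # near 0.0 -> non-match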
The alternative approach is syntactic pattern recognition.  In this scenario, the image is first sharpened by applying Laplacian-based filters, then analyzed using edge-detection and feature extraction algorithms.  These algorithms serve to reduce the huge image to a workable set of features, formally termed minutiae.  Minutiae include ridges, valleys, and whorls, the chief features that enable human fingerprint analysts to visually compare the fingerprints collected at a crime scene against those of a small, pre-selected set of likely suspects with rapidity and accuracy.  Once these minutiae have been identified, a picture description language (PDL) is used to derive a syntactic description that is amenable to computer processing and quick comparison.  A trivial example of a fingerprint specification in PDL might be RRV/WW, meaning “ridge next to ridge next to valley, all atop two adjacent whorls.”  In actual practice, a workable description of a fingerprint is far more complex, perhaps resembling the structural diagram of a chemical compound, which shows the arrangements of various atoms and the individual bonds that interconnect them.
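One simple way to compare such syntactic descriptions is by string edit distance: two PDL strings that differ in only a few symbols likely describe the same finger.  The following sketch applies the classic Levenshtein algorithm to the toy RRV/WW specification above; real systems use far richer grammars and matching rules, so this is a conceptual illustration only.

    def edit_distance(a: str, b: str) -> int:
        # Single-row dynamic-programming Levenshtein distance between two PDL strings.
        dp = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            prev, dp[0] = dp[0], i
            for j, cb in enumerate(b, 1):
                prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
        return dp[-1]

    enrolled = "RRV/WW"   # toy PDL description from the text
    probe    = "RRV/WV"   # one feature read differently on re-capture
    print(edit_distance(enrolled, probe))  # 1 -> a near match under a small threshold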
Subban and Mankame (2013, p. 209) recently identified an improved fingerprint matching technique that combines both the geometric and syntactic pattern recognition approaches and eliminates the need for complex PDL specifications.  Their algorithm defines a broader set of minutiae, viz., plain versus tented arch; radial versus ulnar loop; central pocket loop; plain whorl; double loop whorl; and accidental whorl.  Identifying a discrete set of these features without the need to construct a detailed specification of how they abut with and interconnect to one another proved sufficient to yield extremely reliable fingerprint matching results.
Yadav et al. (2015, p. 39) conducted a study in which they surveyed a wide array of fingerprint analysis and matching systems in detail and reached some important conclusions.  They noted that there are cases in which current technology leads to situations wherein fingers from two different persons are declared a match.  However, in such rare cases, the likelihood that other fingers from these two persons will also match is even rarer.  For example, if the likelihood of one of John’s fingerprints matching one of Bill’s is ten thousand to one, then the odds that two of John’s will match two of Bill’s are one hundred million to one.  They also determined that the availability of cloud computing and high-speed data channels can greatly speed up the fingerprint recognition process, which, they found, is markedly superior to, e.g., retinal image comparison given the current state of the art.
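The quoted odds follow directly from an assumption of statistical independence between fingers, as the following arithmetic check illustrates:

    # Worked check of the quoted odds, assuming statistically independent fingers.
    p_single = 1 / 10_000      # one of John's prints falsely matching one of Bill's
    p_double = p_single ** 2   # two independent prints both falsely matching
    print(p_double)            # 1e-08, i.e., one in one hundred million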
Current Innovation and Technology in Biometrics
Opportunities and Future of Biometrics
Currently, biometrics is used in many systems and applications addressing technical, scientific, engineering, and social challenges. Recently, there have been many successful biometric systems. These include hand geometry systems, which are used to control access to premises such as power plants, offices, factories, and campuses. Automated fingerprint identification systems (AFISs) are used to integrate automatic and manual processes in civilian and criminal applications (National Research Council, 2010). Despite this success, biometrics has been faced with the challenge of unrealistic expectations. Simply put, individuals always want these systems to be 100% secure, which is impossible. In practice, however, a system need not be perfect; rather, it must have satisfactory performance. Due to the inherent vulnerability of most biometric systems—attributable to a combination of high error rates, weaknesses in robustness, and gaps in system security—they have not been effectively deployed across very large populations.
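The trade-off behind “satisfactory performance” can be made concrete with the standard error measures, the false accept rate (FAR) and the false reject rate (FRR). The following sketch, using synthetic match scores rather than data from any real system, shows that moving the decision threshold merely trades one error for the other and never drives both to zero:

    import numpy as np

    rng = np.random.default_rng(1)
    genuine  = rng.normal(0.8, 0.1, 10_000)   # synthetic match scores for true users
    impostor = rng.normal(0.4, 0.1, 10_000)   # synthetic match scores for impostors

    for t in (0.5, 0.6, 0.7):
        far = float((impostor >= t).mean())   # false accept rate at threshold t
        frr = float((genuine < t).mean())     # false reject rate at threshold t
        print(f"threshold={t}: FAR={far:.4f}, FRR={frr:.4f}")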
Research Opportunities
Although current development in biometrics has proved to have significant benefits to society in general, the biometric system remains prone to attack. In particular, the system can be improved by directing research at some of its weaknesses. Research indicates that this field can be improved through better scientific interpretation of human factors, solving modality-related challenges, understanding the underlying phenomena, and attending to the statistical engineering aspects of various activities (National Research Council, 2010).
Vulnerabilities of Biometric Systems
Abdullayeva, Imamverdiyev, Musayev, and Wayman (2009) identify the following principal points of attack on a biometric system; a sketch of a countermeasure to the replay attack (item 2) follows the list.

  1. Presenting fake sample
  2. Replay of stored digital biometric signal
  3. Denial of feature extraction
  4. Spoofing of biometric feature
  5. Attacking matching module
  6. Spoofing templates in database
  7. Attacking the channel between the template database and the matching module
  8. Attacking the final decision process
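
As an illustration of a countermeasure to item 2, the following Python sketch shows a conventional nonce-based challenge-response exchange in which a recorded biometric transcript cannot be replayed; the key handling and message formats are illustrative assumptions, not a description of any particular system.

    import hashlib, hmac, os

    DEVICE_KEY  = os.urandom(32)   # hypothetical key shared by sensor and server
    seen_nonces = set()

    def server_challenge() -> bytes:
        return os.urandom(16)      # a fresh nonce for every authentication attempt

    def sensor_respond(sample_digest: bytes, nonce: bytes) -> bytes:
        # Binding the freshly captured sample to the nonce makes any recorded
        # transcript useless in a later session.
        return hmac.new(DEVICE_KEY, nonce + sample_digest, hashlib.sha256).digest()

    def server_verify(sample_digest: bytes, nonce: bytes, response: bytes) -> bool:
        if nonce in seen_nonces:
            return False           # a replayed transcript is rejected outright
        seen_nonces.add(nonce)
        expected = hmac.new(DEVICE_KEY, nonce + sample_digest, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    nonce  = server_challenge()
    digest = hashlib.sha256(b"captured fingerprint features").digest()
    reply  = sensor_respond(digest, nonce)
    print(server_verify(digest, nonce, reply))   # True on first use
    print(server_verify(digest, nonce, reply))   # False: replay detected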

Human Factors and Affordance
One important aspect of the biometric recognition process is the capture of human characteristics. This input is essential since biometric systems are mainly used for human recognition; therefore, understanding the interface through which they operate is essential to making them more efficient (National Research Council, 2010). In particular, very little effort has been put into studying the affordances these systems present to the general public, that is, how readily ordinary users can perceive, understand, and operate them.
Quality is another essential characteristic of the human factors that surround these systems. Basically, the quality of the data collected and stored in these systems is assumed to be high enough that the stored information can be tightly linked to individuals. Since system operators, just like other individuals, are subject to human error, they face such errors when interfacing with the system (Sullivan, 2011). In particular, they are subject to bias when establishing when and how to collect proper images, when judging the quality of images, and when ensuring that the data has been properly preserved.
Distinctiveness and Stability of Biometric Systems
There has been considerable questioning of the nature of the information collected and stored in biometric systems. These questions primarily revolve around the distinctiveness of biometric information. Scientific studies have not yet concluded that individual information in biometric systems is truly unique and can be sustainably relied upon (Sullivan, 2011). Further, depending on the biometric trait captured, changes in a person’s physical traits over time may affect the ability of these systems to correctly identify individuals.
Modality Related Research
All biometric systems rely on one or more biometric modalities. Generally, the choice of modality significantly determines the structure of the system and how information is presented to users. Similarly, it also establishes how match and non-match decisions are made in the biometric system. Facial recognition systems, for instance, should be able to distinguish a subject from the surrounding environment (Sullivan, 2011). Additionally, biometric systems must be able to capture variant representations arising from changes in individuals’ expressions and poses, and under different illumination conditions (National Research Council, 2010). In fingerprint readers, the main concern normally entails reducing the failure to acquire (FTA) or failure to enroll (FTE) rates. Typically, engineers attempt to develop machines that have better sensors, that can capture better images, and that carry high-resolution cameras (Sullivan, 2011). In iris recognition systems, the primary research opportunities are in the optimization of the illumination spectrum, reduction of the FTE and FTA rates, reduction in the size of hardware, and enhanced ability to read the iris from greater distances.
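For concreteness, the FTE and FTA rates mentioned above are simple proportions of failed attempts, as the following sketch with hypothetical counts illustrates:

    # Toy computation of failure-to-enroll (FTE) and failure-to-acquire (FTA)
    # rates from hypothetical field counts.
    enroll_attempts,  enroll_failures  = 5_000,   35
    acquire_attempts, acquire_failures = 120_000, 480

    fte = enroll_failures / enroll_attempts      # fraction of users who cannot enroll
    fta = acquire_failures / acquire_attempts    # fraction of captures that fail
    print(f"FTE = {fte:.2%}, FTA = {fta:.2%}")   # FTE = 0.70%, FTA = 0.40%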
Information Security Research
Biometric systems normally pose two challenges: providing security for information systems, and determining the security, reliability, and integrity of the system itself (Sullivan, 2011). Regarding the first challenge, biometric systems can be misleading when the information in their databases becomes corrupted (National Research Council, 2010). Accordingly, there is need for detailed security research to prevent attacks on biometric systems, such as those that replay previously captured information, and to ensure that stored data is properly concealed.
Current Research Program
Although existing biometric systems are more advanced than those of years past, they are still not 100% effective. In part, their shortcomings stem from the facts that they require some level of human input and that machines are prone to failure. Currently, most research on biometric systems aims at eliminating the challenges present in existing equipment. One company that has been at the front line of making biometric systems more effective is Hitachi.
The current use of biometrics has been important in enabling users to access their information easily and safely. At present, most individuals have between 25 and 50 passwords, which are extremely difficult to recall. Unfortunately, not all biometric systems are foolproof. For example, many iris scanners and fingerprint readers can be defeated using just two-dimensional photographs. More advanced systems require three-dimensional images. At the moment, the Hitachi digital security solution set appears to be exceptionally secure. Hitachi’s system is connected to readers that use near-infrared rays to capture the image of a user’s vein pattern (Dolamore, 2017). Furthermore, the readers only authenticate the fingers of living people (Dolamore, 2017). Consequently, this credential is extremely difficult to steal. In this regard, the vein authentication system is viewed as being very secure and accurate, and also more inclusive with regard to the people who can use it.
In addition to contemporary biometric systems, recent advancements in technology have led to the emergence of biological implants. In Sweden, for example, one rail company currently offers its clients the option of using implants in their hands instead of paper train tickets (Coffey, 2017). These tiny chips use near-field communication technology, which enables passengers to pay their fares by simply passing their hands over a scanner. In the same vein, a Wisconsin vending machine company is pioneering wireless payment for snacks in its machines. In 2017, the company launched a test program with fifty employees to observe the effectiveness of wireless payment, which is likewise performed by passing a hand over a scanner (Darrow, 2017).
Figure 2. Embedded microchip (Darrow, 2017).
Usefulness of Biometric Systems
Typically, biometric systems are used to monitor physical access, regulate time and attendance, control logical access, conduct surveillance, and support law enforcement. The use of biometric systems has significant advantages over traditional passwords in the following ways:

  1. Platforms that use biometrics enable a person to log into a network more quickly than with passwords.
  2. It is difficult to steal most biometric information, which makes these systems difficult to hack.
  3. The use of biometrics enables organizations to reduce the cost of resetting passwords (Cook, Augusto, & Jakkula, 2007).

In physical access control, biometric systems are used to grant or deny entry to various individuals. Traditionally, people used keys and badges; unfortunately, these measures were ineffective since such items could easily be copied or duplicated. Fingerprint recognition and hand geometry recognition are the modalities most commonly used in these systems. Some companies have also installed vein pattern recognition to make these systems more tamper-proof. In most of these instances, the biometric readers are connected to an electromagnetic lock strike that opens after confirmation of an individual’s identity. One of the main advantages of this arrangement is that it eliminates cases of lost or stolen identity cards. Additionally, it ensures that only individuals who have been fully identified are able to access secure premises.
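A minimal sketch of this reader-to-strike flow, with hypothetical template identifiers and a print statement standing in for the relay hardware, might look as follows:

    import time

    ENROLLED_IDS = {"T-1042", "T-1043"}   # hypothetical template IDs of authorized staff

    def release_strike(duration_s: float = 5.0) -> None:
        # Stand-in for energizing the electromagnetic lock strike via a relay.
        print(f"Strike released for {duration_s} s")
        time.sleep(duration_s)

    def on_scan(matched_template_id) -> None:
        # The reader reports a matched template ID, or None on a non-match.
        if matched_template_id in ENROLLED_IDS:
            release_strike()
        else:
            print("Access denied; event logged")

    on_scan("T-1042")   # authorized: door opens
    on_scan(None)       # unknown finger: door stays locked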
The time and attendance function of biometric systems mostly occurs in factories. Such a system ensures that the times at which individuals clock in to work and leave are accurately recorded. Its main advantage is that it eliminates cases of “buddy punching,” since workers have to physically clock in and out (Cook, Augusto, & Jakkula, 2007). Further, since the information is digitally recorded, it enables timely recording and verification of data. Finally, this system enables almost complete automation of the human resources department, especially with regard to the calculation of remuneration.
Biometric systems are also very effective in logical access control. In earlier systems, passwords and usernames were the main safeguards on computers. Unfortunately, these methods were prone to cyber-attacks, especially when a hacker used dictionary-style or denial-of-service attacks. To overcome this challenge, most organizations require their employees to use long, complex passwords and to change these logins regularly. While these measures are appropriate, they unfortunately make it very difficult for employees to recall their logins. Alternatively, employees write the passwords in notebooks or on their desks, which compromises the required security measures (Cook, Augusto, & Jakkula, 2007). To overcome this challenge, most organizations have introduced biometric security measures. Importantly, biometrics enables employees to access the network quickly. Also, these systems are more secure, since it is extremely difficult to steal a person’s physiological and behavioral traits. Finally, organizations are usually able to avoid the financial cost of resetting passwords.
A combination of various biometric tools is used to surveil individuals, primarily with the aim of identifying criminals. In this application, the modality is deployed in CCTV cameras, where it is used to identify known criminals or suspects. The most common forms of surveillance are overt surveillance, covert surveillance, tracking of individuals on watch lists, tracking of persons with suspicious behaviors, and tracking of people engaged in suspicious activities. In overt surveillance, the public is aware that it is being watched. In covert surveillance, individuals do not know that they are being observed (Abdullayeva, Imamverdiyev, Musayev, & Wayman, 2009). The tracking of individuals on a watch list is usually done with the objective of identifying a confirmed criminal. The tracking of individuals for suspicious behavior is a macro-level surveillance that aims at identifying potential criminals. Finally, the tracking of individuals for suspicious activities normally uses CCTV cameras and facial recognition systems to track persons who are involved in criminal activities.
Biometric systems are also used in law enforcement. The most popular method is the fingerprint recognition technique. Additionally, international organizations also use iris, facial, and vein pattern recognition systems. Currently, most security organizations maintain databases holding the biometric identities of criminals. INTERPOL, for example, holds data on more than 55 million criminals. The biometric data of suspects can therefore be quickly searched against these databases to verify whether these persons are the criminals being sought.
Cultural Impact of Biometrics
Cross-cultural attitudinal differences determine how people in different world regions adopt various biometric systems. In more conservative cultures, biometrics is viewed as being too intrusive and even as going against a community’s norms (Cook, Augusto, & Jakkula, 2007). For example, in ultra-conservative Jewish communities, alteration of a person’s body is unacceptable. Therefore, these communities are usually against biometric implants, which are viewed as being too intrusive.
The intrusive nature of biometrics also affects how different societies accept various biometric technologies. Biometric systems are inherently intrusive, since they hold personal information about individuals. Therefore, in communities where people are reserved and highly cherish their privacy, there is usually low acceptance of these systems. In South Korea, for example, most people readily accept the loss of some privacy, which has enabled the country to establish biometric systems much more easily.
Finally, although biometric systems are able to easily capture and retrieve biometric information, these applications have contributed to a culture of isolation. Most individuals tend to spend much of their time alone, since they can privately access what they want (Cook, Augusto, & Jakkula, 2007). Accordingly, the presence of biometrics has led to a reduction in human socialization and interaction.
In conclusion, although biometric systems have enabled individuals to enhance their security, these tools are not 100% effective. A combination of traditional safeguards and continuous monitoring of biometric equipment is needed to ensure that the data contained in these systems is accurate and can be relied upon in various biometric activities. With the current advancement and embrace of technology by most societies, biometric equipment will continue to improve so as to reduce the vulnerabilities inherent in all of it.
 
References
Abdullayeva, F., Imamverdiyev, Y., Musayev, V., & Wayman, J. (2009). Analysis of security vulnerabilities in biometric systems. San Jose, CA: Institute of Information Technology of ANAS. Retrieved from https://danishbiometrics.files.wordpress.com/2009/08/1-13.pdf
Anderson, J., & Vaughn, R.  (1991, September 3).  A guide to understanding identification and authentication in trusted systems, NCSC-TG-017.  Ft. George G. Meade, MD: National Computer Security Center.
Ardiley, S.  (2012, March 19).  History of the common access card (CAC).  Security InfoWatch.  Retrieved from http://www.securityinfowatch.com/article/10653434/history-of-the-common-access-card-cac.
Bowerman, M.  (2017, July 24).  Wisconsin company to install rice-sized microchips in employees.  USA Today.  Retrieved from https://www.usatoday.com/story/tech/nation-now/2017/07/24/wisconsin-company-install-rice-sized-microchips-employees/503867001/.
Cook, D., Augusto, J., & Jakkula, V. (2007). Ambient intelligence: Technologies, applications, and opportunities. Pullman, WA: Washington State University.
Coffey, H. (2017). The future is here – a Swedish rail company is trialling letting passengers use biometric chips as tickets. Independent. Retrieved from http://www.independent.co.uk/travel/news-and-advice/sj-rail-train-tickets-hand-implant-microchip-biometric-sweden-a7793641.html.
Darrow, B. (2017). ‘Ick Factor’ and biometrics limit market for implanted security chips. Retrieved from http://fortune.com/2017/08/02/implanted-chips-vs-biometrics/.
Dolamore, K. (2017). Can biometric technology make our society more secure? The Telegraph. Retrieved from http://www.telegraph.co.uk/business/social-innovation/biometric-technology/.
Hiscott, R.  (2013, December 30).  The evolution of the password – and why it’s still far from safe.  Mashable.  Retrieved from http://mashable.com/2013/12/30/history-of-the-password/#PqNUogCzQsqE.
Riley, C., et al.  (2009, October).  Culture & biometrics: Regional differences in the perception of biometric authentication technologies.  ResearchGate.  Retrieved from https://www.researchgate.net/publication/220414954_Culture_biometrics_Regional_differences_in_the_perception_of_biometric_authentication_technologies.
National Research Council. (2010). Biometric recognition: Challenges and opportunities. Washington, DC: The National Academies Press. https://doi.org/10.17226/12720.
Subban, R., & Mankame, D. P.  (2013, May).  A study of biometric approach using fingerprint recognition.  Lecture Notes on Software Engineering 1(2): 209-213.
Sullivan, C. (2011). Digital identity: Inherent vulnerabilities. JSTOR, 1-13.
Yadav, S., et al.  (2015, June).  Fingerprint recognition based on minutiae information.  International Journal of Computer Applications 120(10): 39-42.  Retrieved from http://research.ijcaonline.org/volume120/number10/pxc3903862.pdf.