Thanks to massive gains in accuracy and lower costs, facial recognition is better than ever and its applications for governments are growing. But with the technology’s adoption come increased threats to personal data.
In 2017, the Washington County Sheriff’s Office in Hillsboro, Ore., became one of the first law enforcement agencies in the country to use Amazon’s facial recognition software program, known as Rekognition. Powered by the latest in artificial intelligence, the tool allows Washington County officers to look for matches of a suspect’s face by quickly sorting through more than 300,000 mug shots in a county database that goes back to 2001, according to a report in The Washington Post.
Sheriffs have used facial recognition more than a thousand times since the program was installed more than 18 months ago, and the software has led to dozens of arrests for theft, violence and other crimes, according to the Post. But its use has also raised red flags among privacy advocates, who question its accuracy, especially for individuals with darker skin, and how police decide when to use the technology.
The facial recognition market is growing rapidly ($7.76 billion in global value by 2022, according to the research firm MarketsandMarkets) as demand for surveillance, monitoring and identity verification grows. But as the technology becomes increasingly accurate, critics say, it could further diminish what privacy we have left. That has set off alarm bells, demands for stricter regulations and laws on how and when it should be used, and, in some cases, outright bans.
Biometric identification technology has been around for decades. Fingerprinting became a high-tech, automated process with the development of automated fingerprint identification systems (AFIS) in the 1980s. In 1999, the FBI launched a nationwide version of AFIS, which now contains fingerprints from more than 143 million individuals. During the same period, facial recognition technology began to emerge, thanks to early generation algorithms that could map a subject's facial features and measure differences between faces, which a computer could then compare against a database containing thousands, even millions, of images.
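The early approach described above can be sketched in a few lines: reduce each face to a vector of measurements and compare the vectors by distance. This is a minimal illustration, not any vendor's actual algorithm; the landmark names, values and threshold are all hypothetical.

```python
import math

# Hypothetical facial measurements (normalized) for two photographs.
# The feature names and values are illustrative only.
face_a = {"eye_spacing": 0.42, "nose_width": 0.18, "jaw_width": 0.61}
face_b = {"eye_spacing": 0.43, "nose_width": 0.19, "jaw_width": 0.60}

def landmark_distance(f1, f2):
    """Euclidean distance between two facial-measurement vectors."""
    return math.sqrt(sum((f1[k] - f2[k]) ** 2 for k in f1))

# A small distance suggests the same subject; the cutoff is arbitrary here.
MATCH_THRESHOLD = 0.05
is_match = landmark_distance(face_a, face_b) < MATCH_THRESHOLD
```

In a real system, searching a database means repeating this comparison against every stored record, which is why early systems needed so much computing power as databases grew.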
Because of limits in computing power and the lack of sophistication with the algorithms, facial recognition was not considered a versatile, practical tool. But that has changed in recent years. “There have been a number of recent advances in face biometrics, primarily in its accuracy, thanks to new, breakthrough technologies,” said Daniel Asraf, senior vice president of Biometric Identity Solutions at Gemalto, part of Thales, a French multinational firm.
The breakthrough has been in the field of artificial intelligence, which has grown rapidly in strength and performance. So-called convolutional neural networks have led to "massive gains" in accuracy in the past five years, according to the National Institute of Standards and Technology. "Prior to neural networks, existing technology just couldn't reach the accuracy needed to make facial recognition versatile and useful," said Asraf. "But the neural networks of today are like an offshoot of the brain, giving them a unique ability to recognize images."
Asraf believes two other factors have contributed to the rise of facial recognition: more computational capacity, which means the algorithms can run on silicon chips tailor-made for faster processing; and better optics and sensors that can render images at extremely high fidelity. “We have much better cameras that can capture images in huge variations of environmental conditions, compared to the resolutions available a decade ago,” he said.
Yet another factor that has boosted the rapid growth is cost. "The price points on supporting technologies — cloud, processing power, memory — are much cheaper than they were 10-15 years ago," said Benji Hutchinson, vice president of federal operations for NEC's Advanced Recognition Division. Amazon rolled out Rekognition as an easy-to-use tool that does not require a heavy investment in hardware and is available at very low prices. Washington County's Sheriff's Office pays about $7 a month for its image searches, according to the Post.
Civil rights groups including the ACLU have unsuccessfully lobbied Amazon to stop selling its Rekognition software to governments.
In the early days of facial recognition, limits on algorithms, computing power and other capabilities meant that using the tool required humans as much as computers.
“You had human testers who upgraded the algorithms in a very tactical way,” said Hutchinson. “Humans would record whether or not matches worked well, and then they would tweak the algorithms accordingly.”
Today, that work is all done by computers. The data sets of images are also much larger and, when combined with AI, make the algorithms much more robust, explained Hutchinson. The result is facial recognition that can work from less-than-perfect images. “Facial recognition used to need a fairly pristine image to get a good match,” he said. “Today, that’s not the case. The images you are checking against in a database can be off-angle, even a profile.”
This capability has led to the point where certain facial recognition systems can work in real time, identifying images on the fly. With performance like that, facial recognition has become a much more versatile way to identify images of people. And, with the growth in image databases, the technology is no longer a highly specialized, expensive tool that only a few agencies and organizations can afford to use.
Matthew Zeiler, founder and CEO of Clarifai, an AI firm that specializes in visual recognition, says that as the technology has improved, the number of use cases has grown, particularly in the private sector, where certain industries, such as hospitality, use it to tag images so they can be easily found by computer. It is also being used to automatically filter out unwanted online content, such as drugs, weapons and nudity. Globally, many countries are beginning to use facial recognition for airport and hospital security, while the military is putting the technology into drones that fly over battlefields.
For the public sector, advances in facial recognition technology have opened the door to uses in a variety of fields. Zeiler said his firm has worked with agencies and organizations that would like to use Clarifai’s platform and tools to find and identify people during natural disasters, such as tornadoes, hurricanes, floods or fires. “In these scenarios, when it might not be safe or possible to send humans into the disaster area, drones equipped with cameras using facial recognition might be able to identify specific people,” he said.
School systems have shown interest in the technology as a way to provide security in an age where school shootings have become a grim reality. Departments of motor vehicles, which receive thousands of photographs on a daily basis, can use facial recognition to detect whether the applicant is in the database under another name to evade a driving ban. Correctional facilities are also using the technology as a way to monitor the location of inmates.
But law enforcement remains the prime sector in state and local government for facial recognition technology. According to the IJIS Institute, a nonprofit alliance of tech firms working in the public sector, the technology is typically used by law enforcement to help verify a face against a known image. Sometimes called one-to-one analysis, this use involves ascertaining whether the person presenting a passport or driver's license matches the photo embedded in the document. Police also use facial recognition to compare the image of a face against numerous known faces within a database. This function is called discovery and is sometimes referred to as one-to-many analysis.
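The distinction between the two modes can be made concrete with a short sketch. Modern systems reduce each face to a numeric embedding and compare embeddings by similarity; verification checks one pair, while discovery scans a whole gallery. This is a toy illustration under assumed names and values, not any vendor's implementation, and the three-dimensional vectors stand in for the hundreds of dimensions real systems use.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

THRESHOLD = 0.9  # illustrative decision threshold, tuned per system in practice

def verify(probe, claimed):
    """One-to-one: does the probe face match the claimed identity?"""
    return cosine_similarity(probe, claimed) >= THRESHOLD

def identify(probe, gallery):
    """One-to-many (discovery): best match in the gallery, or None."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    name, score = max(scored, key=lambda t: t[1])
    return name if score >= THRESHOLD else None

# Hypothetical gallery of enrolled embeddings.
gallery = {
    "subject_a": [0.9, 0.1, 0.1],
    "subject_b": [0.1, 0.9, 0.2],
}
probe = [0.88, 0.12, 0.11]
```

The one-to-many mode is what makes database size matter: every probe is scored against every enrolled face, so both accuracy and speed of the similarity comparison drive how large a gallery is practical.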
Washington County's Sheriff's Office is just one of a small but growing number of law enforcement agencies using the latest generation of facial recognition technology. The Orlando, Fla., Police Department announced last year it would begin testing Amazon's software before recommending it for broader use. Besides Amazon, Gemalto and NEC, a sizable number of firms have entered the market, many with solutions geared toward the law enforcement community, including Brainchip, Cognitec, Ever AI, FaceFirst, Idemia and Suspect Technologies, to name a few. One notable firm that has stepped back from using its facial recognition software for police work is Microsoft, which recently rejected a California law enforcement agency's request to install the company's software in police cruisers and body cams out of concerns for human rights.
Microsoft's sensitivity about how its technology might be used without proper checks and balances in place reflects the dilemma police departments face: advancing crime fighting with the latest technology, while that same technology carries the potential for abuse, intrusiveness and invasion of privacy.
Rick Myers, executive director of the Major Cities Chiefs Association, which is composed of the police chiefs from the 68 largest cities in the U.S. and the nine largest cities in Canada, is well aware of that quandary. When it comes to technology, police have learned to move with caution, given their unique role and the need for trust when it comes to protecting the safety of communities. Myers cited the example of gunshot location technology. “A few years ago, it showed great promise, but the number of false positives was quite high,” he said. “Over time, a small number of agencies that were early adopters of the technology provided feedback to vendors, who improved the technology, leading to more widespread adoption today.”
Myers described facial recognition as early in its maturation phase. Citing concerns about the inability of some facial recognition technologies to accurately identify people of color, as well as a range of skin tones and facial shapes, he emphasized that the law enforcement community can't afford any technology with an inherent bias. "On the other hand, we see great potential for facial recognition and other types of artificial intelligence to really expedite our ability to identify high-risk subjects, potential terrorists or serial criminals, prior to them committing another offense. That's a highly desirable quality," he said.
In May, San Francisco, with the blessing of the mayor and city supervisors, banned city use of facial recognition technology. Oakland is also looking to enact a similar ban. In January, a bill was filed in the Massachusetts Legislature for a statewide ban on the use of facial recognition by state agencies. The state of Washington tried, but failed, to pass a privacy law that hinged on putting strict limits on the use of facial recognition.
With stories coming out of China, where facial recognition has become widespread and has been used to suppress ethnic minorities, the chorus of concerns about the technology's ability to strip away any semblance of privacy has grown louder. These examples of legislative and regulatory pushback have tech firms and law enforcement officials worried.
Gemalto's Asraf believes the answer to the problem is for the tech community and government to deliver a combination of good governance and better education. "As a technology provider, we provide the tools for facial recognition, but there's a second piece, and that's governance, which the governments we provide the tools to need to adhere to," he said. As for education, Asraf called it a very important piece of how facial recognition can advance, "because there are so many misrepresentations and misconceptions about the technology."
Myers, who has four decades of experience as a police officer and chief, has seen privacy groups attempt to stifle other technologies considered too intrusive for law enforcement. “It can take years to learn where the boundaries are with regards to emerging technologies that can benefit both law enforcement and communities,” he said.
Myers points out that the police operate within the confines of the U.S. Constitution, which protects individuals against government intrusion into their lives. "However, what gets lost in the discussion is that there is no expectation of privacy in a public place, where you are likely to be on CCTV anyway," he added. Myers sees benefits to using facial recognition in public spaces, along with AI that can analyze human behavior, to help police thwart a possible crime or terrorist act before it occurs.
One possible public use of facial recognition in the near future could be at a large event such as the Super Bowl. If evidence suggests someone may plan to bring a dirty bomb to the football game, cameras with facial recognition technology could be set up around the perimeter to try to identify known suspects. Such a system would have to be able to search and identify someone in real time, a capability that points to where the technology is headed.
“We have the technology today to conduct real-time facial recognition in the field using body cameras,” said Asraf. “With the rollout of 5G, we can expect to see the use of facial recognition in an edge computing environment with real-time feedback.”
Clarifai’s Matt Zeiler also sees edge computing giving facial recognition a significant boost in capabilities. Beyond that, the accuracy will continue to get better as the algorithms get stronger and faster, thanks to ongoing improvements in processing speed and optics that will deliver astounding resolution. “Better accuracy is a no-brainer,” he said.
Benji Hutchinson of NEC perhaps put facial recognition’s future in the best perspective. “It truly is a 21st-century tool.”