Tampa police try a facial recognition project again after problems shut down its first attempt
- By William Welsh
- Jan 31, 2002
A control room in Newham, England, uses Visionics' FaceIt facial recognition system.
Screen shot shows the computer interface of Visionics' FaceIt, a facial recognition system that police in Tampa, Fla., are giving a second try.
Problems with the first major deployment of facial recognition software by a major U.S. police department are raising questions about whether its capabilities have been oversold.
The police department in Tampa, Fla., deployed Visionics Corp.'s FaceIt software in June 2001 in a trial using 36 security cameras stationed in a Tampa entertainment district known as Ybor City. But two months later, the police turned it off because of a glitch in the operating system that runs the facial recognition software.
Although the police and Visionics, based in Jersey City, N.J., say they have fixed the problem and redeployed an upgraded system, the American Civil Liberties Union wants the effort halted.
Calling the facial recognition system "an overhyped failure," the New York-based ACLU said the police log showed the system was prone to false matches and that it never positively identified any missing or wanted individuals whose images are contained in the department's photographic database.
The Tampa Police Department and Visionics acknowledge the system was turned off for several months, but said they have made improvements to the system, which began operating again Jan. 17.
Police and company officials also are working to increase the number of images stored in the database and upgrade the hardware platform and software engine.
But Barry Steinhardt, the ACLU's associate director and co-author of a Jan. 3 ACLU report on the project, said the upgrades cannot overcome problems associated with capturing images at long distances and under poor lighting conditions.
"The concept of conducting widespread surveillance using facial recognition is a flawed concept. There is no upgrade that will make this more useful," Steinhardt told Washington Technology.
Visionics' product did not create the problem that caused the system to be taken offline, said Bill Todd, a Tampa police department detective and the FaceIt project coordinator. The interruption in service did not reflect a lack of desire on the part of the police department to use the technology, he said.
The system's inability to identify wanted or missing persons can be partly attributed to the small size of the photographic database against which new images are compared, said Joseph Atick, Visionics' chairman and chief executive officer.
The database contains images of about 900 fugitives and missing persons, officials said. Once images are added from various sources, the database may contain more than 45,000 images.
Facial recognition systems, also known as facial scanning systems, potentially could be a significant part of the homeland security equation, according to supporters.
Visionics and other companies that manufacture facial recognition software have reported growing interest in their technology from airports, law enforcement agencies and motor vehicle departments following the Sept. 11 terrorist attacks.
Although still being refined, such systems can confirm the identity of individuals and may be useful in helping law enforcement officials apprehend criminals and terrorists.
Facial recognition companies will have to work hard to prove that their technology can make a measurable contribution to law enforcement, said Dario Stipisic, a senior consultant with the New York-based International Biometric Group.
"Moving forward, it is going to be interesting to see if facial scan vendors can improve their technologies," he said.
To upgrade the Tampa system, Visionics has installed the FaceIt Argus System. The new system has a scalable platform that allows clients to customize large-scale facial recognition systems.
While the initial installation could retrieve facial images from only one camera at a time to match against the database, the upgraded system will retrieve images from six cameras at a time, said police and company officials.
In the case of Tampa, a police officer downloads facial images through the surveillance cameras in Ybor City for comparison against the database. If no match is made, the images are discarded.
FaceIt was installed in Tampa for a one-year trial period to validate the technology in a public setting, said Frances Zelazny, a Visionics spokeswoman. The initial installation was valued at $30,000 for one year, she said.
"If the system performs to their satisfaction, then [the Tampa police department] will keep it permanently. ... The fact that they went ahead with an upgrade is very promising," Zelazny said.
The cost of the project will not increase as a result of the upgrade, Atick said. The Visionics executive said the company was willing to install and maintain the system in Tampa to validate the technology in a law enforcement setting.
Todd said the police department learned valuable lessons from the deployment last summer, such as camera positions, response times and database management and operations.
"We've deployed [facial recognition] in the hardest environment to use it. ... We've learned some things, and we're moving forward," he said.
Visionics also has deployed FaceIt on a trial basis at four U.S. airports, including Boston's Logan International Airport and Washington Dulles International Airport, Atick said. These projects can range in cost from $30,000 to $1.5 million, depending on the number of cameras downloading facial scans, he said.
Robert Atkinson, vice president and director of the New Economy and Technology Project at the Washington-based Progressive Policy Institute, said the ACLU's overall position is to make sure the technology doesn't go forward.
"Regardless of whether the technology worked, the ACLU was going to do everything in its power to make facial scanning look like it is 'Big Brother' technology," he said.
Atkinson said that facial recognition technology is still in its early stages and is likely to improve and become more accurate as time goes on. He said the technology should be deployed in controlled environments where it is possible to get a "clean shot" of faces in good lighting conditions.
Thus, it is better suited for airport security gates, entrances to government buildings and high-profile events, such as the 2002 Winter Olympics, than it is for street surveillance.
The ACLU's Steinhardt disagreed. He said facial recognition would be ineffective even at airports because of the lack of high-quality images of terrorists to store in a database. He said the ACLU is not opposed to the use of biometrics, but believes they must be suited to the situation.
For example, biometrics such as fingerprint scanning or iris scanning might be used to confirm the identity of baggage handlers, flight crews and maintenance personnel who work in airport security areas. But facial recognition software is not sophisticated enough to do this effectively, Steinhardt said.
"If you were securing a location, this is not the biometric you would use," he said.
Staff Writer William Welsh can be reached at firstname.lastname@example.org.
How it works
Facial recognition is software that identifies faces by measuring characteristics, such as the distance between the eyes, the length of the nose and the angle of the jaw, to create a unique file called a template. Matching images is done in two ways.
In one-to-one matching, two facial images are compared to determine whether they belong to the same person. This approach is typically used for information security, transaction authentication and access control applications.
The second type, one-to-many matching, compares one facial image against a database to determine whether the newly captured image matches any image stored there. This approach is used in data mining, ID documents and surveillance applications.
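The two matching modes can be sketched in a few lines of Python. This is an illustration only, not Visionics' method: real systems derive templates from facial geometry, and the toy templates, names, scoring function and 0.95 threshold below are all assumptions for the example.

```python
import math

def similarity(template_a, template_b):
    """Similarity score in (0, 1]: 1.0 means identical templates,
    lower values mean more dissimilar faces. (Illustrative scoring;
    commercial engines use proprietary measures.)"""
    return 1.0 / (1.0 + math.dist(template_a, template_b))

def verify(probe, enrolled, threshold=0.95):
    """One-to-one matching: does the probe image belong to the
    one enrolled person? Used for authentication and access control."""
    return similarity(probe, enrolled) >= threshold

def identify(probe, database, threshold=0.95):
    """One-to-many matching: search a database of (name, template)
    pairs and return every name whose score clears the threshold.
    Used for surveillance-style searches."""
    return [name for name, template in database
            if similarity(probe, template) >= threshold]

# Toy templates standing in for measurements such as eye distance,
# nose length and jaw angle (names and values are hypothetical).
db = [("fugitive_1", [0.62, 0.41, 0.77]),
      ("fugitive_2", [0.55, 0.48, 0.69])]
probe = [0.61, 0.42, 0.76]

print(verify(probe, db[0][1]))  # True: probe is close to fugitive_1
print(identify(probe, db))      # ['fugitive_1']
```

The design difference is the shape of the question: verification compares against one known template, while identification scans the whole database, which is why database size matters so much in the surveillance case.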
The software uses the templates to produce a score that measures how similar the images are to each other. In the case of surveillance, the software operator sets a threshold score above which the system sets off an alarm for a possible match. The higher the threshold, the fewer alarms; the lower the threshold, the more alarms.
Sources: American Civil Liberties Union,
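The threshold tradeoff works like a simple filter over match scores. A minimal sketch, using hypothetical scores and thresholds (FaceIt's actual scoring is proprietary):

```python
def alarms(scores, threshold):
    """Return the match scores that would set off an alarm:
    every score at or above the operator-chosen threshold."""
    return [s for s in scores if s >= threshold]

# Hypothetical match scores from one batch of camera captures.
scores = [0.31, 0.55, 0.72, 0.81, 0.93, 0.97]

print(alarms(scores, 0.90))  # strict threshold: [0.93, 0.97] -> 2 alarms
print(alarms(scores, 0.50))  # loose threshold: 5 alarms, more false matches
```

Raising the threshold trims alarms but risks missing real matches; lowering it catches more candidates at the cost of the false alarms the ACLU report criticized.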
William Welsh is a freelance writer covering IT and defense technology.