Orlando police decide to keep testing controversial Amazon facial recognition program

SAN FRANCISCO — The Orlando Police Department in Florida is planning to continue its test of a facial recognition program from Amazon, despite outcry from civil rights and privacy groups that law enforcement and government agencies could abuse the technology.

OPD announced last month that the proof-of-concept trial of the software had expired, but public information officer Sgt. Eduardo Bernal said in a release Monday that the department will continue testing the program.

Two years ago, Amazon built the facial and product recognition tool, called Rekognition, as a way for customers to quickly search a database of images and look for matches. Rekognition uses Amazon’s cloud computing network AWS to compare images to a database of images the customer has provided.

At least two police departments around the country have been testing the software, and privacy advocates have protested, fearing law enforcement might abuse the technology.

In May, civil rights and privacy groups, including the American Civil Liberties Union, demanded that Amazon stop selling Rekognition to law enforcement and other government entities because they fear it could be used to unfairly target protesters, immigrants and any person just going about their daily business.

“Amazon should be protecting customers and communities, it should not be in the business of powering dangerous surveillance,” said Matt Cagle, an ACLU attorney.

The tool is used by many companies and organizations beyond law enforcement. Amazon says it has been used to find abducted people and that amusement parks have used it to find lost children.

During the royal wedding of Prince Harry and Meghan Markle in May, British broadcaster Sky News used Rekognition to help it identify celebrities as they entered Windsor Castle.

OPD is testing Rekognition with images of seven Orlando officers who volunteered for the trial. The city created a database with pictures of the officers, then compared those to a video stream from eight city-owned surveillance cameras to see if it could correctly identify the officers when in view of the cameras.

Orlando Police Chief John Mina said Monday the department has made good strides in the pilot program and will continue the testing to determine whether it will add “value in enhancing the City’s public safety mission in a manner that balances reasonable privacy concerns.”

No images of the public are used for testing, the facial recognition tool is not used in an investigative capacity and all elements of the program are in accordance with laws related to privacy and civil rights, Bernal said in the release.

Thirty-four civil rights groups sent a letter on May 22 to Amazon CEO Jeff Bezos, saying people should be “free to walk down the street without being watched by the government. Facial recognition in American communities threatens this freedom.”

They fear the real-time face recognition could allow police or government groups to watch crowds and pick out activists or undocumented residents.

“Amazon Rekognition is primed for abuse in the hands of governments. This product poses a grave threat to communities, including people of color and immigrants, and to the trust and respect Amazon has worked to build,” the letter states.

Police and city officials provided examples in a memo last week of how the facial recognition tool could be used. They said police could locate a wanted fugitive before he or she committed another crime, locate a suspect who has made violent threats, find a missing child or identify a registered sex offender near schools.

“If this technology works, the suspects in these examples would have had their images entered into the system and perhaps could have been spotted by one of the many cameras, and never allowed to get anywhere near the victims or a large gathering,” officials said in the memo.

Source: USA Today
