
Philly police use of Clearview AI was just ‘a test’ — but facial recognition is already here

The investigative applications are numerous, police said. But for civil-rights advocates, it’s also a new frontier of concern.

The real-time crime command center is shown in the Anti-Terrorism unit building in South Philadelphia. (JOSE F. MORENO / Staff Photographer)

Philadelphia police have been quietly experimenting with Clearview AI, the controversial facial-recognition company that mines billions of images from social media and across the web, a department spokesperson said, confirming a recent BuzzFeed report.

The thousand or so searches the department ran last November were a test, according to the spokesperson, Staff Inspector Sekou Kinebrew. The department has not yet finished drafting a facial-recognition policy, he said, but one will be disseminated to staff in the near future.

But facial recognition is already a reality in Philadelphia, where police have access to a database called the Pennsylvania Justice Network (JNET) — and it could be an even bigger part of the future. The Streets Department currently has a request for proposals to upgrade the city’s 100,000 streetlights to smart LED lights. In addition to numerous benign features — light sensors, temperature gauges, and humidity readings — smart-light capabilities include license plate readers, microphones, and cameras with facial recognition.

Philadelphia has not ruled out including any of those features in its eventual “smart cities” investment, a city spokesperson said. And the Police Department has not yet decided whether it will pursue a contract with Clearview AI.

The investigative applications of facial recognition are numerous, Kinebrew said. But for civil-rights advocates, it’s also a new frontier of concern — one with potential constitutional implications.

Clearview AI in particular has raised alarms. New Jersey Attorney General Gurbir S. Grewal issued a blanket ban on its use in the state in January. That announcement followed a New York Times investigation that raised concerns about the security of the servers holding law-enforcement data. One expert warned, “The weaponization possibilities of this are endless.” The paper also revealed that the company’s investors and clients used it to spy on dates and at business gatherings.

Paul Hetznecker, a defense and civil-rights lawyer, said ubiquitous facial-recognition cameras could raise the same privacy concerns that the U.S. Supreme Court found unacceptable in Carpenter v. United States. The court in 2018 found that using cell-site records to track a person’s movement over time was a privacy violation.

“With facial recognition, it’s even more dangerous and more invasive,” Hetznecker said. “We’re moving toward a complete surveillance state.... It means your movement through public space becomes not a free and unfettered movement through a free society. It becomes something easily tracked and documented.”

A coalition of social-justice organizers called Philly Tech Justice is beginning to organize around digital surveillance, specifically facial recognition, said Devren Washington of the Media Mobilizing Project.

A resolution before City Council calls for enhanced video surveillance near schools as a response to gun violence. To Washington, the intentions are good, but the technology is vulnerable to abuse; he likened it to a high-tech stop-and-frisk.

“They ended up using stop-and-frisk against a whole slew of people who are innocent of doing anything but looking suspicious,” he said. “These investments in surveillance technologies are at the expense of investments into wraparound services” — for instance, job training and treatment programs — “which have been shown across the country to massively reduce crime.”

Kinebrew said facial recognition software is viewed only as an investigative tool, to be used “in a discreet manner that limits the possibility of police action based on unverified results.”

In addition, he said in an email, “Facial recognition software could be effective for identifying persons with Alzheimer’s or dementia, as well as unidentified homicide victim cases.”

JNET, the facial-recognition software in use by the department, can run comparisons against more than 35 million images, including PennDOT’s database of driver photographs, as well as photos from state and county probation, parole, and prison databases. A brochure quotes Philadelphia Police Sgt. Andrew Callaghan from 2018 after a series of heroin and fentanyl arrests.

“This would not have happened without the magic of JNET,” he said.