Biometric governance is a burden for some police officials


Like law enforcement leaders who tire of tedious think-before-shooting lectures, some police officials have no time for ethical guidelines when it comes to officers using facial recognition and related surveillance technologies.

An article by the publication Tech Monitor might surprise even proponents of biometrics in law enforcement.

It quotes UK Police Minister Kit Malthouse saying that helping officers understand the outsized impact biometrics can have on the people they serve could “stifle innovation” in surveillance and identification technologies.

Malthouse, according to Tech Monitor, told a House of Lords committee that there was more to lose in terms of tackling crime than there was to gain by, for example, setting up a governing body responsible for reviewing ethical use guidelines.

Frameworks, he explained, are “usually for more mature technology.”

In fact, Malthouse suggested that Parliament itself can create, judge, and manage the ethical behavior of police departments that use AI systems to prevent crime.

A possible flaw in this argument is that agencies across the country are buying and deploying systems now without coordination, let alone best practices.

(Not everyone in UK national politics agrees; ethics remain a matter of debate.)

The same is true at the regional level in the United States.

Any discussion at the national level about ethics guidelines for law enforcement is just that: discussion. Clearview AI, a facial biometrics provider that has been condemned by the courts for its “publicly viewable content has no privacy protection” business model, has just signed a one-year subscription with the Federal Bureau of Investigation.

A joint investigation by the Pulitzer Center and the South Florida Sun Sentinel into how facial recognition systems are used reveals resistance to ethics policies among police officers.

Palm Beach and Broward counties in Florida perform more facial scans than nearly all sheriff’s offices in the state, according to the resulting article, and county police departments refuse to create biometrics governance policies. Palm Beach and Broward reportedly launched 9,000 scans from February 2020 to June 2021.

The system they use, known as FACES, was launched 20 years ago.

Criticism fell on Fort Lauderdale, in Broward County, in the spring of 2020 when police used the FACES facial recognition system to try to identify people peacefully protesting the police killing of George Floyd.

The city’s police department said it would create ethical use policies following the incident.

The Broward County Sheriff’s Office’s refusal helps demonstrate why coordinated policies are needed: county law enforcement officials endorse a freer hand than their counterparts in Fort Lauderdale, the county’s largest city.

Back in the UK, Nottinghamshire Police has put out a press release about a pilot test of an unspecified facial recognition system.

Granted, the document is written for public consumption, but the only apparent policy is to increase the rate at which suspects can be identified, arrested, and brought before a judge.

Speed in policing is good, but historically its pursuit can create an overreliance on techniques and technology. There may be a sound internal policy, but if the posting is accurate, Nottinghamshire seems set to put people in front of judges during a pilot, which is usually the time to work out bugs and get to grips with a new technology or product.

Article topics

AI | biometric identification | biometrics | ethics | facial recognition | police | regulation | surveillance | United Kingdom | United States
