Edmonton Police Service partners with U.S. company to test use of facial-recognition bodycams


The Edmonton Police Service announced Tuesday that it will become the first police force in the world to test an artificial intelligence (AI) product from Axon Enterprise: body cameras equipped with facial recognition.

“I want to be clear that this facial recognition technology will not replace the human component of investigative work,” said acting Supt. Kurt Martin, of EPS’s information and analysis division, during a press conference.

“In fact, the similarities identified by this software will be verified by human officers trained in facial recognition.”

Martin said the police force’s goal is to test another tool in its operations toolbox that can help further ensure the safety of the public and officers while respecting privacy considerations.

Axon Enterprise, an Arizona-based company, develops weapons and technology products for military, law enforcement and civilians in jurisdictions where it is legal.

Starting Wednesday, up to 50 Edmonton police officers who currently wear body cameras will begin using Axon’s facial recognition-enabled cameras on their shifts for the rest of the month.

Why now?

In 2023, the provincial government announced plans to require body cameras for all Alberta police officers. EPS began rolling out bodycams to its members in 2024.

The partnership with Axon is independent of the provincial mandate, Martin said.

“The proof of concept is a limited trial period to determine the feasibility of facial recognition on body-worn video cameras and its functionality within policing,” he said.

WATCH | Edmonton police test AI facial recognition on body cameras:

Until December, Edmonton police officers will test facial recognition technology on body-worn cameras, using it to potentially match people who interact with officers with people in the police database. Edmonton police emphasize this is just a proof of concept, but one expert expresses concern that Canadians are being used as guinea pigs.

Martin said the trial will test the technology’s ability to allow officers to use mugshots to identify people already in the system due to “officer safety cues and warnings from previous interactions.”

The technology will also allow police to assess security risks and be aware of people who have outstanding warrants for serious crimes such as murder, aggravated assault and robbery.

“In total, there are 6,341 people who have a flag or caution,” Martin said. “Currently, there are more than 20,615 charges that have been sworn in Edmonton.

“As police officers, we have an obligation to attempt to execute these warrants in a timely manner to ensure that persons charged with criminal offences can be tried within a reasonable time, in accordance with their Charter rights… and in accordance with the schedule established by the Supreme Court of Canada.”

How does it work?

When officers wearing the cameras involved in the test are in the field, the facial recognition system will not be actively working, said Ann-Li Cooke, director of responsible AI at Axon Enterprise.

She said the system is intended to be activated by officers during investigations or enforcement actions, at which point the cameras will begin recording.

When these body cameras are actively recording, the facial recognition technology will automatically run in “silent mode,” Cooke said.

Officers will not receive alerts or notifications about facial resemblance while on duty.

If a person is within four meters of a body camera, their face is detected and the data is sent to the cloud for comparison with the EPS database of persons of interest.

If they don’t match, then the facial data is immediately discarded, Cooke said.

“The system actually starts with the database upload, so the Edmonton Police Service and any police agency would identify who their database would be built with,” she said, noting these include serious warrants and officer safety alerts.

“We really want to make sure it’s targeted, so it’s people with serious crimes that are uploaded to this database.”

Cooke said Axon has no ability to see, dictate or govern what type of people are uploaded to the database, which is wholly owned and operated by EPS.

Once the images are captured, EPS agents trained to analyze facial recognition data will review the images to see if the software is working as intended and if there is a match.
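The pipeline described above can be summarized in a short sketch. This is a hypothetical simplification for illustration only: the function names, the embedding-based matching, and the similarity threshold are assumptions, not Axon’s actual API or parameters. Only the detection range (four metres), the agency-owned watchlist, the immediate discard of non-matches, and the absence of officer-facing alerts come from the article.

```python
# Hypothetical sketch of the "silent mode" matching flow described above.
# All identifiers and thresholds are illustrative assumptions.

DETECTION_RANGE_M = 4.0   # faces beyond this distance are not detected
MATCH_THRESHOLD = 0.9     # assumed similarity score needed to flag a match

# Agency-owned watchlist (per the article, built and controlled by EPS):
# person ID -> face embedding vector (toy 2-D vectors for illustration).
watchlist = {
    "person_a": [0.9, 0.1],
    "person_b": [0.2, 0.8],
}

def similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def process_frame(face_embedding, distance_m):
    """Return a candidate match for human review, or None.

    Mirrors the described pipeline: detect a face within range, compare it
    against the agency database, and discard non-matching facial data.
    No officer-facing alert is raised either way (silent mode); candidate
    matches go to trained examiners for verification.
    """
    if distance_m > DETECTION_RANGE_M:
        return None  # face too far away to be detected
    for person_id, reference in watchlist.items():
        if similarity(face_embedding, reference) >= MATCH_THRESHOLD:
            return person_id  # queued for review by trained examiners
    return None  # no match: facial data is immediately discarded
```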

What are the concerns?

Ian Adams is an assistant professor of criminology at the University of South Carolina who studies the intersection of policing and technology and serves on the Criminal Justice Council’s AI working group.

Adams said police should take care to understand AI before deploying it on the general public.

“We are in the fastest technology adoption phase in modern policing, faster than even body cameras,” he said in an interview with CBC News.

“And unfortunately, that necessarily happens before we understand much about the use of these technologies, whether they capture the intended benefits, of course, and then their unintended consequences.”

EPS said it submitted a privacy impact assessment to Alberta’s information and privacy commissioner to determine whether the trial use of the technology respects public privacy and is conducted legally.

Diane McLeod, Alberta’s information and privacy commissioner, said she is very concerned about EPS proceeding with the use of facial recognition technology.

“From a privacy perspective, there are a number of issues with facial recognition technology, particularly around accuracy,” she said in an interview with CBC News.

“Under our Privacy Protection Act, which the EPS is subject to, they actually have a duty of accuracy, so they would need to be able to establish in their privacy impact assessment that the accuracy meets the threshold of the Privacy Protection Act.”

McLeod said facial recognition technology has a proven track record of being problematic and pointed to a report by several Canadian privacy commissioners investigating the practices of a U.S. technology company called Clearview AI.

Commissioners found that Clearview AI violated Canadian privacy laws by collecting photographs of Canadians without their knowledge or consent.

The commissioners’ report found that Clearview AI’s technology created a significant risk to people by allowing law enforcement and businesses to compare photographs to its database of more than three billion images.

Canada’s privacy commissioner, along with commissioners from Alberta, British Columbia and Quebec, issued an order in 2021 for Clearview AI to stop operating in the country and remove images of Canadians collected without their consent.

McLeod said the matter is still before the court.

A 2019 report from Axon’s AI and policing technology ethics board found that, at the time, facial recognition technology was not reliable enough to ethically justify its use on body cameras.

“At a minimum, facial recognition technology should not be deployed until it performs with much greater accuracy and works equally well across races, ethnicities, genders, and other identity groups,” the report says.

Responding to the report, Cooke said the technology has developed since 2019.

“At that time, there were both race and gender gaps,” she said. “While we were doing our due diligence in evaluating multiple models, we were also looking to see if there were racial differences and found that that is no longer the case.

“Race is not the limiting factor today; the limiting factor is skin tone. And so, when there are different conditions, such as distance [or] low lighting, there will be different optical challenges with body cameras, and all cameras, in detecting and matching darker-skinned people compared with lighter-skinned people.”

However, Gideon Christian, associate professor of AI and law at the University of Calgary, said the inequalities associated with facial recognition technology are too great to ignore and that he believes there is not enough recent research to suggest significant improvement.

“Facial recognition technology has been shown to have its worst error rate when identifying darker-skinned people, especially black women,” he said.

In some case studies, Christian said, facial recognition technology has shown an accuracy rate of about 98 percent in identifying white male faces, but an error rate of about 35 percent in identifying darker-skinned women.

“It was a huge surprise to me that Edmonton police chose to be the guinea pig for this experiment that infringes on Charter rights,” he said.

Christian said he thinks the implications for privacy in the public sphere are chilling.

“Basically, what we’re seeing is a situation where this tool for police accountability suddenly becomes a tool for mass surveillance.”

University of British Columbia law professor Benjamin Perrin said he believes rigorous safeguards will need to be implemented when using facial recognition technology.

“We need to see a full impact assessment on how this would affect the rights of people who are being filmed in these interactions with police,” he said.

The Edmonton Police Commission and Chief’s Committee will review the results of facial recognition body cameras before deciding on their future use in 2026.
