On March 26, 2022, a violent assault occurred on a Maryland Transit Administration bus in suburban Baltimore. The attacker punched the female bus driver several times after an argument over COVID masking rules, then fled with her cell phone.
According to a recent New Yorker report, surveillance cameras captured images of the assailant. Transit police used these photos to create a Be on the Lookout (BOLO) bulletin distributed to law enforcement agencies.
An analyst at the Harford County state’s attorney’s office ran one surveillance image through facial recognition software. The algorithm matched Alonzo Cornelius Sawyer, a Black man in his 50s from Abingdon, MD.
Sawyer was arrested days later while appearing in court for an unrelated case.
Police interrogated him and showed him the BOLO photos, which he insisted were not of him, but they dismissed his claims after his probation officer, Arron Daugherty, positively identified Sawyer as the attacker upon viewing the footage. Daugherty had only met Sawyer briefly twice before, both times while Sawyer wore a mask.
Sawyer’s wife, Carronne Jones-Sawyer, also adamantly denied that the photos showed her husband, citing physical differences in age, build, clothing, and more. She offered potential alibis placing Sawyer away from the scene when the assault occurred. Nevertheless, detectives conducted no further investigation to corroborate the facial recognition match.
AI racial bias
This case exemplifies the risks of overreliance on AI tools without adequate safeguards.

Racial bias leads facial recognition systems to misidentify people of color at much higher rates. Yet in the police investigation, the algorithmic match outweighed contradictory eyewitness testimony.
After a month in jail, the charges were dropped when Daugherty admitted doubts after meeting with Sawyer’s wife. The use of facial recognition was never disclosed to Sawyer. Neither he nor his wife was notified when another man was eventually arrested.
The story highlights concerns about inadequate training in facial recognition, lack of corroboration, failure to disclose the use of the technology, and confirmation bias, which led police to dismiss contradictory evidence.
Critics argue that facial recognition use should be banned or strictly limited, given its potential for abuse and for entrenching injustice without oversight. Sawyer believes he would have falsely pleaded guilty without his wife’s intervention, showing how the practice can enable overzealous prosecution.
As rapid AI advancements unfold, the public needs protection against unproven technologies. Sawyer’s experience underscores the urgent need for reform, transparency, and accountability to prevent more wrongful arrests.
Featured Image Credit: Cottonbro Studio; Pexels; Thanks!
The post Faulty facial recognition leads to false imprisonment appeared first on ReadWrite.