Man Files For Damages After The First Case Of Facial Recognition Leading To Wrongful Arrest
As artificial intelligence and machine learning have entered our daily lives, few applications have been more hotly debated than facial recognition. Governments and law enforcement officials argue that facial recognition in surveillance cameras is a necessary crime-fighting tool, while the public has pushed back, arguing the technology infringes on civil liberties. Regardless of the opposition, law enforcement in many countries, including the USA and China, has deployed it. Now, following a serious blunder by US law enforcement, a lawsuit has been filed in the city of Detroit over the first known case of a wrongful arrest resulting from facial recognition technology. Robert Williams claims he was not even through his front door when police made the arrest, suspecting him of shoplifting $4,000 worth of watches in an incident over a year earlier. “I came home from work and was arrested in my driveway in front of my wife and daughters, who watched in tears, because a computer made an error,” Williams said in a statement.
“This never should have happened, and I want to make sure that this painful experience never happens to anyone else.” The arrest happened after video footage captured at the scene of the crime was run through facial recognition software. Despite the footage being ambiguous – with poor lighting, and the shoplifter never actually looking directly at the camera – the facial recognition technology returned Williams as a match. This match reportedly led directly to Williams' arrest. However, since the arrest, the police involved have backtracked and admitted that “the computer got it wrong”, with Williams arrested only because of a false likeness to the perpetrator. “As a result, Mr. Williams was arrested without explanation on his front lawn in plain daylight in front of his wife and children, humiliated, and jailed in a dirty, overcrowded cell for approximately 30 hours where he had to sleep on bare concrete—all for no reason other than being someone a computer thought looked like a shoplifter,” states the lawsuit, which was filed on Tuesday, April 13.
This marks the first time facial recognition has been blamed for the wrongful arrest of a person. Local law enforcement has since disputed the claim, stating that it was in fact not the facial recognition but “just bad detective work”. "Facial recognition was used, but that's not why the arrest was bad," said Detroit Police Chief James Craig in a statement, reports The Detroit News. Facial recognition is currently used in everything from law enforcement to unlocking your phone. It uses deep learning algorithms to identify faces in real time by analyzing the distances between facial features, shapes, and specified nodal points, creating a unique numerical signature for each person. However, facial recognition – like all AI – is still in its infancy, and is prone to error. The problem appears to be heightened when the technology tries to tell black faces apart: current iterations show a significant drop-off in accuracy when identifying people with darker skin tones, a clear racial inequity that suggests it should probably not be used in law enforcement just yet. Williams’ attorneys claim this may have happened here, and that Detroit should consider following other cities in banning the technology.
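The matching step described above boils down to comparing numerical signatures and declaring a match when two are close enough. The following is a toy sketch, not any vendor's actual pipeline: the four-dimensional "signatures", the threshold value, and the sample vectors are all invented for illustration (real systems use deep-network embeddings with hundreds of dimensions).

```python
import math

# Toy sketch of signature-based face matching (illustrative only).
# Each "face" here is a hypothetical vector of normalized distances
# between facial landmarks (nodal points).

def euclidean(a, b):
    """Distance between two face signatures."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(probe, gallery_face, threshold=0.6):
    """Declare a 'match' if the probe's signature is closer to a gallery
    face than the threshold. The threshold choice directly trades false
    positives against false negatives."""
    return euclidean(probe, gallery_face) < threshold

suspect_cctv = [0.31, 0.58, 0.42, 0.77]     # hypothetical noisy footage
innocent_person = [0.30, 0.55, 0.45, 0.75]  # similar-looking bystander

# With a loose threshold, a mere lookalike is flagged.
print(is_match(suspect_cctv, innocent_person, threshold=0.6))  # True
```

Because the decision reduces to a distance threshold, two different people with similar measurements – or one person measured from blurry footage – can land inside the threshold, which is exactly how a "false likeness" can end in an arrest.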
Thousands Of People Labeled As "Criminals" By Facial Recognition Software Were Actually Innocent
Last year, British police departments in South Wales and Leicestershire started trialing facial recognition technology to track down suspected criminals when they are out and about. In theory, this should cut the amount of time spent looking for and identifying lawbreakers. In reality, it is a bit of a mess. That's because the facial recognition technology is not actually that good at recognizing faces.

Take one example. During a 2017 football (or soccer) match between Real Madrid and Juventus, over 2,000 fans were mistakenly identified as potential offenders – of the 2,470 individuals flagged by the system, 2,297 (93 percent) were “false positives”. The software relies on cameras to scan and identify faces in a crowd and check them against a photo bank of custody images. If there's a match, the officer on shift will review it and either disregard it or, if they agree with the algorithm, dispatch an intervention team to question the suspect. However, a big problem lies in the fact that these custody images are often poor quality and blurry. This means you only have to vaguely resemble a person in one of them to be flagged by the system as a possible felon.
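The headline false positive rate follows directly from the two reported counts. A quick check, using only the numbers above:

```python
flagged = 2470          # individuals flagged by the system at the match
false_positives = 2297  # flags later judged incorrect

rate = false_positives / flagged
print(f"{rate:.1%}")    # -> 93.0%
```

Equivalently, only 173 of the 2,470 flags (about 7 percent) pointed at genuine matches – the human review step is doing almost all of the filtering.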
South Wales Police admitted in a statement that “no facial recognition system is 100% accurate”. This is a bit of an understatement. There have been not one but several instances when false positives have vastly outnumbered true positives, including an Anthony Joshua fight where 46 fans were incorrectly identified and a Wales vs Australia rugby match where 43 fans were incorrectly flagged.

"I think the false positive rates are disappointingly realistic," Martin Evison, a forensic science professor at Northumbria University, told Wired. "If you get a false positive match, you automatically make a suspect of somebody that is perfectly innocent."

There are also concerns about privacy, particularly as there is so little legal oversight regarding this type of technology. Big Brother Watch, a UK civil rights group, is planning a campaign against facial recognition, which it intends to bring to parliament later this month.
"Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool," the group tweeted.
However, others argue that this type of mass surveillance is needed to keep the public safe in crowded spaces.

“We need to use technology when we’ve got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that,” Chief Constable Matt Jukes told the BBC. “But we don’t take the use of it lightly and we are being really serious about making sure it is accurate.”

It hasn't been a total failure. South Wales Police claim the technology has helped catch and arrest 450 criminals since its launch in June 2017. They also say that no one has been wrongly arrested. “With each deployment of the technology we have gained confidence in the technology and has enabled the developers at NEC to integrate our findings into their technology updates,” a spokesperson for South Wales Police explained.