10 Times Facial Recognition Technology Got It Really Wrong
Over the past few years, facial recognition technology has become more powerful and, supposedly, better at detecting faces accurately. At least that is what we often hear from the advocates of this technology. The truth, however, is slightly different. Although face-recognition systems have improved significantly, there have been instances when they failed terribly and even led to serious consequences.
Here’s how facial recognition technology got it really wrong, proving that it shouldn’t be trusted blindly:
Apple iPhone X
This recent embarrassing incident occurred during the launch of Apple’s new iPhone X, when Craig Federighi struggled to unlock the phone with its much-talked-about Face ID feature. “Face ID is the future of how we unlock iPhones and protects sensitive information,” says Apple. The company discarded the old Touch ID and chose this new facial recognition technology instead. However, this key feature didn’t perform well on launch day. While demonstrating it, Federighi was forced to enter the passcode because the system didn’t recognize his face on the first go. The incident quickly drew heavy criticism online, but the company maintains that the feature is foolproof and that the problem occurred because the phone had been handled improperly before the demo.
Samsung S8 & Note 8
Samsung’s much-boasted face-recognition feature is riddled with flaws. Several videos floating around the internet show that it can be easily tricked by holding up a selfie on another phone. This is quite worrying, as anyone who has your selfie can easily get into your phone. The company, however, has acknowledged that the feature is not meant for security and is simply another way to get to the Home screen rather than swiping to unlock.
Boston Marathon Bombing

The Boston Marathon bombing is another instance in which facial recognition technology failed miserably. Despite CCTV footage of the two suspects, the face-detection system used by the police didn’t identify either of them, even though both were already in its database. According to investigators, the failure came down to poor image quality, which did not work in the algorithm’s favor.
Google Photos
Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4
— OOP (@jackyalcine) June 29, 2015
Google’s popular Photos app, which relies heavily on facial recognition to categorize photos into albums, failed terribly when it labeled two Black friends as “gorillas”. Jacky Alcine, a user of the app, brought it to Google’s attention when an entire collection of photos of him and his friend was misclassified. Alcine made the issue public by posting the mistake on Twitter, and Google faced a backlash, especially because of the error’s racist overtones. Google did apologize for the mistake and promised that issues like this wouldn’t happen again. However, only time will tell how soon the company can improve the intelligence behind its face recognition technology, as the app has misclassified photos on numerous other occasions.
Nikon Coolpix

In the pursuit of packing more geeky features into gadgets, companies sometimes end up creating unintended consequences. The facial recognition technology in the Nikon Coolpix camera did just that. Joz Wang, one of the customers who bought the camera, ran into a strange issue when she tried to take her own picture. The camera flashed the message “Did someone blink?” every time she tried to shoot a portrait. It was only when her brother posed with his eyes wide open, and the warning still appeared, that they realized the camera could not tell whether Asian eyes were open. She posted her picture on her blog under the title “Racist Camera! No, I did not blink… I’m just Asian!”, and the post was quickly picked up by Gizmodo and Boing Boing. What is surprising is that Nikon, a Japanese company, did not design the camera with Asian faces in mind.
Notting Hill Carnival

Despite resistance, London’s Met Police used its controversial and inaccurate automated facial recognition system for the second year in a row, and it proved worse than useless. The system produced 35 false matches and led to an “erroneous arrest” over a rioting offense. Although the system failed terribly, the Met still considers the trial a success. The Met said: "We have always maintained that it was a continued trial to test the technology and assess if it could assist police in identifying known offenders in large events, in order to protect the wider public."
HP Webcams
In 2009, HP faced an awkward situation when its new webcams with a face-tracking feature failed to recognize black faces. One user posted a YouTube video demonstrating the flaw, which quickly went viral. The company responded by citing contrast intensity as the cause of the problem:
“We are working with our partners to learn more. The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose. We believe that the camera might have difficulty seeing contrast in conditions where there is insufficient foreground lighting.”
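HP never published its exact detection code, but the statement above describes the classic Haar-like-feature approach popularized by Viola and Jones, which classifies a region as a face by comparing pixel-intensity contrasts between rectangles such as the eye band and the upper cheeks. A minimal sketch of that style of detector, built on OpenCV’s stock Haar cascade, illustrates how dependent it is on foreground lighting and contrast; the cascade file, parameters, and image path here are illustrative assumptions, not HP’s actual implementation.

# Illustrative sketch only: a Haar-cascade face detector of the kind HP's
# statement describes, not HP's actual face-tracking code.
import cv2

# OpenCV ships a pre-trained frontal-face Haar cascade; its features are
# intensity-contrast tests between regions such as the eyes and upper cheeks.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(image_path):
    """Return face bounding boxes found in the image at image_path."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Histogram equalization boosts local contrast; with poor foreground
    # lighting the raw contrast can be too low for the cascade to fire at all.
    gray = cv2.equalizeHist(gray)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if __name__ == "__main__":
    print(detect_faces("webcam_frame.jpg"))  # hypothetical example frame

Because every test in such a cascade is a raw intensity difference, an underexposed, low-contrast face can simply fall below the decision thresholds and return no detection at all, which is consistent with HP’s explanation.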
Fake Driver’s License

The Massachusetts State Police wrongly flagged a legitimate driver’s license as fake after its facial recognition system confused the license holder with someone else. John H. Gass of Needham had to prove his identity after being informed that his driver’s license was not valid. The problem, however, lay with the security system, which found John’s face similar to that of one of Massachusetts’ millions of other drivers. While officials say that false positives in such systems are inevitable, the concern is what would happen if someone were arrested simply because a face-recognition system decided he resembled a wanted criminal.
New Zealand Passport Renewal Software

Facial recognition software got the New Zealand Government into trouble after it failed to recognize the picture of an Asian man.
Richard Lee, who wanted to renew his passport, was surprised when the online passport renewal system, which uses facial recognition technology, rejected his photo. An awkward error message popped up on the screen: “The photo you want to upload does not meet our criteria because: Subject eyes are closed”. Lee received plenty of support from locals, though he took the issue in good humor. The New Zealand Department of Internal Affairs responded by saying that its software is among the most advanced in the world and that the error was caused by uneven lighting on his face.
FBI’s Facial Recognition System

This is probably one of the scariest incidents of a facial recognition system getting it all wrong. Steve Talley, a financial advisor from Denver, was falsely accused twice of holding up two banks after the FBI’s facial recognition system found similarities between him and the man who robbed them. The charges were later dropped: the facial examiner had missed a mole on Talley’s right cheek, and a subsequent height analysis showed that Talley was three inches taller than the robber. The arrests, however, cost Talley everything. He lost his job and his family, was injured during the arrests, and ended up homeless. He has filed a lawsuit seeking $10 million in damages.