Google Contractors Reportedly Targeted Homeless People for Facial Recognition Tests

Reports have emerged of underhanded methods used to acquire the facial data of individuals.
Chris Young
Face recognition. Source: monsitj/iStock

Google recently admitted that it had employees walking the streets of several U.S. cities, asking people whether they wanted to sell their facial data in exchange for $5 gift certificates. The data was being used to help improve the Pixel 4's face unlock system.

Now, the New York Daily News reports that Google contractors have been using some very dubious tactics to get people's facial data.


Targeting the most vulnerable

According to several sources who spoke to the Daily News, a contracting agency called Randstad sent teams to Atlanta with the specific intention of finding homeless people with dark skin.

According to the report, the contractors often didn't say that they were recording people's faces or that they were working for Google.

While it's not clear whether Google knew Randstad was targeting homeless people, a Google manager did instruct the group to target people with darker skin, the Daily News report says.

Dubious tactics

One of the methods used to collect facial data involved pretending to play "selfie games" with strangers.

As the Daily News report says, "one [source] said workers were told to say things like, 'Just play with the phone for a couple minutes and get a gift card,' and, 'We have a new app, try it and get $5.'"


An ex-staffer also told the reporters, "They said to target homeless people because they’re the least likely to say anything to the media." What's more, "the homeless people didn’t know what was going on at all."

Bad deeds in the name of good?

The irony in this whole story is that the data collected through the contractors' questionable targeting is being used to build a facial recognition database that is less biased. As The Verge reports, a lack of data on people of color has created bias in facial recognition systems.

As many people have argued, it seems increasingly evident that the biases inherent in technologies such as AI and facial recognition reflect those of their creators.
