
A major flaw in Google's algorithm allegedly tagged two black people's faces with the word 'gorillas'

Google has apologised to a black man who says the search giant’s photo algorithms sorted pictures of him and a friend, also black, under the category “gorillas.”

“Google Photos, y’all f**ked up,” Jacky Alciné wrote on Twitter, accompanying a screengrab of the photo and the racially offensive tag. “My friend’s not a gorilla.”


Google says it’s now trying to figure out how this happened, the Wall Street Journal reports. The answer is probably an error within Google Photos’ facial recognition technology.

Google gave this statement to the WSJ:

“We’re appalled and genuinely sorry that this happened,” a company spokeswoman said. “There is still clearly a lot of work to do with automatic image labelling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

We’ve also contacted Google for comment.

Alciné, a developer, wondered on Twitter how this could have happened.

Apparently, when Alciné searched for photos of “gorillas” within his Google Photos library, the problem persisted: other photos of himself and the same friend populated the results.

Less than two hours after Alciné had sent his original tweet, Google’s Chief Architect of Social, Yonatan Zunger, jumped in, tweeting at Alciné to try to correct the problem.

The Googler also offered sympathy to Alciné in response to racist Twitter trolls.

Google had created a fix for the problem in about an hour. But the next morning, Alciné told Zunger that two of his photos were still surfacing under the offensive word.

Zunger then said he’d send the issue out for a more permanent fix.

It appears that in the short term, the company will stop using the label “gorillas” as a category. Later, it will work out how to make its algorithms more sensitive to language that could be offensive.
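The short-term measure described above, suppressing a risky label rather than retraining the model, can be sketched as a simple post-processing step on classifier output. This is an illustrative guess at the approach, not Google's actual code; the blocklist contents and the (label, confidence) output format are assumptions.

```python
# Illustrative sketch of a label blocklist applied after classification.
# The BLOCKED set and the (label, confidence) prediction format are
# assumptions for this example, not Google's actual implementation.

BLOCKED = {"gorilla", "gorillas", "chimp", "chimpanzee", "monkey"}

def filter_labels(predictions):
    """Drop any predicted label on the blocklist, keeping the rest.

    `predictions` is a list of (label, confidence) pairs as a
    hypothetical image classifier might return them.
    """
    return [(label, conf) for label, conf in predictions
            if label.lower() not in BLOCKED]

predictions = [("person", 0.92), ("gorillas", 0.41), ("outdoor", 0.88)]
print(filter_labels(predictions))  # the offensive tag is dropped
```

A blunt filter like this trades recall (real gorilla photos would go untagged too) for safety, which matches the stopgap the company appears to have chosen while it works on the underlying model.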

As the Wall Street Journal pointed out, the Google flub illuminates the shortcomings of facial recognition technology.

“We need to fundamentally change machine learning systems to feed in more context so they can understand cultural sensitivities that are important to humans,” Babak Hodjat, chief scientist at Sentient Technologies, told WSJ. “Humans are very sensitive and zoom in on certain differences that are important to us culturally. Machines cannot do that.”
