The Google Photos app recently made a blunder that proved offensive to a computer programmer named Jacky Alcine. Jacky was shocked to see that the Google app had given him a racially disgraceful tag: the facial recognition software had labeled a photo of Jacky and one of his friends, both African-Americans, with the word 'Gorillas'. Soon after this insulting act by the software, Jacky Alcine called out Google and let its officials know what had happened.
He tweeted angrily about the incident and asked Google what kind of sample data it had collected that could make this happen with their photo.
Yonatan Zunger Noticed the Tweets and Calmed the Outrage
His tweets were read by Yonatan Zunger, a senior engineer at Google and the chief architect of Google+. Alcine's account was linked to the Google+ blog of an engineer with a similar name, which is how the matter came to Zunger's attention. Zunger was very displeased with what had happened to Jacky Alcine. He said that this would not be tolerated and that the problem that caused it would be rectified as soon as possible. Acknowledging the fault of such a big company, he tweeted to Alcine: "Sheesh. High on my list of bugs you never want to see happen".
Yonatan Zunger announced that Google would no longer use the label 'Gorillas' in its face recognition software. Google has also considered improvements to the software so that the next time it labels users' photos it will not produce offensive labels of any kind, which could tarnish the reputation of a company with such a big name in the market, as people would no longer trust Google or an app of this nature.
This Is Not the First Mistake Made by the Software
Zunger clarified in his tweets that the team was working on it and that many things had already been corrected. He identified the problems the software was having with obscured faces and with different skin tones and lighting conditions. The software also had a problem in which it tagged people of all races as dogs.
Alcine was pleased with the response he got from Zunger and thanked him for it. He was reassured by a past example of a fault in the program, in which a user's dogs were tagged as horses. Zunger said the software was not targeting any particular race; it was an ordinary error of the kind that happens in the machine learning process. But he also admitted that, because of the history of racism, this ordinary error took the shape of a major insult that hurt the user's feelings.