Flickr's Image Recognition Tool Is Making Some Embarrassing Errors


Well, this is awkward. Flickr’s seemingly impressive image recognition system is making some embarrassing slips when identifying black people and concentration camps, according to the Guardian.


The newspaper explains that the new algorithm, which is designed to tag and then filter images by content, is "misfiring frequently." It describes how a portrait of a black man named William got auto-tagged as "blackandwhite" and "monochrome" along with "animal" and "ape." (Incidentally, a picture of a white woman got the same treatment.) Elsewhere, pictures of the Dachau concentration camp were tagged with "jungle gym" and "sport," while one of Auschwitz was also tagged as depicting "sport."

But it's worth pointing out that such mistakes are natural, and even useful. Flickr uses a machine-learning approach to identify images, comparing new pictures to ones it has seen in the past to try to work out what they show. As a result, it won't always get them right, so an important step is for users to delete inappropriate tags so that the system can learn from its mistakes. In other words, it improves faster by making a few errors and then having them corrected.
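To make that feedback loop concrete, here's a minimal sketch, not Flickr's actual system, of how user corrections can train a tag classifier. It's a toy perceptron: it predicts whether a given tag applies to an image (represented as a made-up feature vector), and nudges its weights whenever a "user" deletes a wrong tag or restores a missing one. All names and data here are hypothetical.

```python
def predict(weights, features):
    """Return True if the tag's weighted score crosses the threshold."""
    score = sum(w * x for w, x in zip(weights, features))
    return score > 0.0

def learn_from_correction(weights, features, correct_label, lr=0.1):
    """Update weights only when the prediction disagrees with the user."""
    if predict(weights, features) != correct_label:
        sign = 1.0 if correct_label else -1.0
        weights = [w + lr * sign * x for w, x in zip(weights, features)]
    return weights

# Toy data: three made-up features per image, plus whether the tag applies.
examples = [
    ([1.0, 0.2, 0.0], True),
    ([0.9, 0.1, 0.1], True),
    ([0.0, 0.8, 1.0], False),
    ([0.1, 0.9, 0.9], False),
]

weights = [0.0, 0.0, 0.0]
for _ in range(20):  # several rounds of user corrections
    for features, label in examples:
        weights = learn_from_correction(weights, features, label)

print(all(predict(weights, f) == y for f, y in examples))  # prints True
```

After a few rounds of corrections the classifier stops making the mistakes it was corrected on, which is exactly the dynamic the article describes: early errors plus user feedback are what drive the improvement.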

So, yes, the slips that Flickr’s algorithm has made are embarrassing. But they’re also going to make it less awkward in the future. [Guardian]

