Image recognition algorithms are nothing like our eyes, and here is blobby static proof. These images fooled an algorithm into seeing a gorilla, a bikini, a stopwatch, and more—yet to human eyes they resemble nothing of the sort.
What these squares of static reveal, of course, is that a deep neural network trained to recognize images looks for very different things than a human eye does. We can look at a photograph of a chair and identify its legs and back and cushion. A neural network that's been fed dozens of pictures of a chair will look for patterns in how the pixels are arranged. How a computer "sees" a chair is utterly distinct from how a human brain sees a chair.
And so you get this static. In a study uploaded to arXiv, and reported by MIT Technology Review, a trio of computer scientists created these static images using a neural network called AlexNet. Here's how they got AlexNet to first create and then try to identify these images, as explained by Tech Review.
They operated it in reverse, asking a version of the software with no knowledge of guitars to create a picture of one, by generating random pixels across an image. The researchers asked a second version of the network that had been trained to spot guitars to rate the images made by the first network. That confidence rating was used by the first network to refine its next attempt to create a guitar image. After thousands of rounds of this between the two pieces of software, the first network could make an image that the second network recognized as a guitar with 99 percent confidence.
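The feedback loop described above—generate an image, have a trained network rate it, and keep only the changes that raise the confidence score—can be sketched as a simple hill-climbing routine. This is a minimal illustration, not the researchers' actual method: the `confidence` function below is a hypothetical stand-in for a trained classifier like AlexNet (it just rewards pixels close to an arbitrary target pattern so the loop is runnable), and the image is a flat list of 64 grayscale values rather than a real photo.

```python
import random

# Hypothetical stand-in for a classifier's target concept. A real
# classifier like AlexNet encodes this implicitly in its weights;
# here we fake it with a fixed random pattern so the demo runs.
_rng = random.Random(42)
TARGET = [_rng.random() for _ in range(64)]

def confidence(image):
    """Stand-in confidence score: 1.0 when the image matches TARGET.

    A real run would query the trained network instead.
    """
    error = sum((p - t) ** 2 for p, t in zip(image, TARGET))
    return 1.0 - error / len(image)

def evolve_image(rounds=5000, seed=0):
    """Refine a random image until the 'classifier' rates it highly."""
    rng = random.Random(seed)
    # Start from random pixels, as the generating network does.
    image = [rng.random() for _ in range(64)]
    best = confidence(image)
    for _ in range(rounds):
        # Mutate one pixel; keep the change only if the confidence
        # rating improves -- the round-trip loop from the article.
        i = rng.randrange(len(image))
        old = image[i]
        image[i] = min(1.0, max(0.0, old + rng.uniform(-0.1, 0.1)))
        new = confidence(image)
        if new > best:
            best = new
        else:
            image[i] = old  # revert a change that didn't help
    return image, best
```

After a few thousand rounds the score climbs close to 1.0, even though the resulting pixel grid would look like static to a person—the same mismatch between machine confidence and human perception the study exploits.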