Take one neural network that describes what it sees in an image. Provide it with a webcam feed from the MacBook it’s running on. Then, wander around a city and see what happens. Here are the results of exactly that experiment.


That is, more or less, what Kyle McDonald did to create this video. Using Andrej Karpathy’s “NeuralTalk” code modified to run from a webcam feed, he walked around the bridge at Damstraat and Oudezijds Voorburgwal in Amsterdam. The footage is a little shaky and weird, because McDonald is simply pointing a laptop at passersby, but the results are interesting.
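McDonald's actual modification isn't shown here, but the basic pattern is simple: grab frames from the webcam and pass every Nth one to the captioning model. Here's a minimal, hypothetical sketch of that loop; `captioner` is a stand-in for any NeuralTalk-style model, and the OpenCV lines in the comment are one common way to supply the frames.

```python
def caption_stream(frames, captioner, every_n=10):
    """Run a captioning model on every Nth frame of a video stream.

    `captioner` is a placeholder for any image-captioning model
    (e.g. a NeuralTalk-style network): it takes one frame and
    returns a string describing it. Captioning every frame is too
    slow for real time, so we subsample with `every_n`.
    """
    for i, frame in enumerate(frames):
        if i % every_n == 0:
            yield captioner(frame)

# With a real webcam, the frames would typically come from OpenCV:
#   cap = cv2.VideoCapture(0)
#   frames = iter(lambda: cap.read()[1], None)
#   for caption in caption_stream(frames, model.predict):
#       print(caption)
```

Decoupling the frame source from the captioner like this means the same loop works on a live camera, a saved video, or a list of test images.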

Perhaps unsurprisingly, it doesn’t always get it right—for instance, it takes a while for the software to decide that the guy in shot is eating a hot dog rather than holding his cellphone, and it occasionally sees things that no sane human would. But the fact that a chunk of code can work out, from pixel color and brightness alone, what’s being shown in an image in real time is frankly amazing.

If you’re interested in learning more about how neural networks are increasingly becoming part of your everyday online experience, why not read our explainer?


[Kyle McDonald via Charles Arthur via Verge]