Like babies, neural networks need to be trained on what to look for when analyzing the world. This usually involves feeding them a data set of hours of footage or thousands of photos, each annotated with data describing what it contains. For this research, the CSAIL researchers trained their neural network on photos of people doing everyday activities like walking, sitting, and talking.


The AI was then taught to generate stick-figure skeletons representing the poses and movements of the people in the photos, and then to match those skeletons to measurements of scattered radio signals captured at the same time. Eventually the neural network was able to generate those skeletons from the scattered radio signal data alone, which, it turns out, easily passes through walls where light can't. It's X-ray vision without the need for blasting harmful X-rays.
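The training scheme described above is a form of cross-modal supervision: a camera-based pose estimator acts as a "teacher," labeling each frame with a skeleton, and a "student" model learns to predict that same skeleton from the synchronized radio measurements. The toy sketch below illustrates the idea with synthetic data and a linear student model; the dimensions, noise levels, and learning rate are all illustrative assumptions, not details from MIT's system.

```python
import numpy as np

rng = np.random.default_rng(0)

RF_DIM, POSE_DIM = 32, 28   # toy sizes: 14 keypoints x (x, y)
N = 200                     # number of synchronized frames

# Synthetic stand-in data: radio features for each frame, plus the
# skeleton the camera-based "teacher" extracted from the matching photo.
rf = rng.normal(size=(N, RF_DIM))
true_map = 0.1 * rng.normal(size=(RF_DIM, POSE_DIM))
teacher_pose = rf @ true_map + rng.normal(scale=0.01, size=(N, POSE_DIM))

# Student: a linear map from RF features to keypoints, trained so its
# output matches the teacher's skeletons (mean-squared-error loss).
W = np.zeros((RF_DIM, POSE_DIM))
lr = 0.1
losses = []
for step in range(500):
    pred = rf @ W
    err = pred - teacher_pose
    losses.append(float(np.mean(err ** 2)))
    W -= lr * (rf.T @ err) / N   # gradient descent step

# After training, the student predicts skeletons from radio data alone,
# so at inference time no camera (and no line of sight) is needed.
```

Once the loss converges, the teacher can be discarded entirely, which is what lets the final system work through walls and in the dark where a camera never could.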

The skeletal representations of hidden humans are definitely still on the crude side, but the CSAIL researchers are working on generating 3D representations that capture subtler, smaller movements. One practical application of their research could be in hospitals or nursing homes, where patients' movements (dangerous falls) or symptoms (shaking hands) could be tracked without intrusive video cameras, which don't work in the dark anyway.


MIT's AI was even able to identify individuals based solely on their movements 83 percent of the time when trained on a group of 100 different people. As a crime-fighting tool, the technology has a lot of interesting potential, too, all but eliminating the usefulness of ski masks or the cover of darkness for hiding a suspect's identity. But it also has scary privacy implications: invisible radio signals are everywhere, and it seems all but impossible to prevent someone from using them to track you against your will.

[MIT News via Taxi]