With all the data that’s constantly sent and received by our phones, there’s been an ever-increasing focus on combating viruses, malware, and other online attacks, to the point that we sometimes forget what’s going on in the real world. But good physical security is still one of the most important parts of protecting your devices, and thanks to a couple of members of Google’s research team, your phone might soon be able to tell when others are peeping at your screen from over your shoulder.
In a presentation at the Neural Information Processing Systems conference next week, Hee Jung Ryu and Florian Schroff are scheduled to discuss their electronic screen protector project, which uses the selfie camera on a Google Pixel and artificial intelligence to detect if multiple people are looking at the screen.
According to Ryu and Schroff, the program can recognize a second face in just two milliseconds, and works across a range of angles, poses, and lighting conditions. And while more details will be announced at the presentation, a demo of the software in action can already be seen in the unlisted but public video above.
To achieve such fast recognition, it seems the team’s program is using TensorFlow Lite, Google’s latest venture into AI and machine learning, which runs complex visual analysis on your phone’s own processor rather than needing to ping beefier servers in the cloud.
So the next time you are looking at some sensitive info, whether it’s a document from work, a text from a friend, or even the PIN for your phone, this is exactly the kind of program that could help prevent prying eyes from peering at your screen. Now the question is: How long will it take for this functionality to make its way into the greater Android ecosystem, and will it ever?
[Quartz]