This Google Glass App Can Tell You How Stressed You Are

Google Glass can now detect when your stress levels are through the roof, which, let's face it, is decidedly less creepy than having it detect someone else's.

A Glass app called BioGlass, developed by researchers at Georgia Tech and MIT, uses the built-in accelerometer, gyroscope and front-facing camera on Google's wearable device to measure your heart rate, breathing, and tiny movements.
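The researchers haven't published their code, but the basic idea of pulling a breathing rate out of a motion sensor is simple enough to sketch. Here's a minimal, hypothetical illustration (not BioGlass itself), assuming a single vertical-axis accelerometer trace: subtract the constant gravity offset, then look for the strongest frequency in the plausible respiration band.

```python
import numpy as np

def breathing_rate_bpm(accel_z, fs):
    """Estimate breathing rate (breaths/min) from a vertical-axis
    accelerometer trace sampled at fs Hz, by finding the dominant
    frequency in a typical respiration band (0.1-0.5 Hz)."""
    x = np.asarray(accel_z) - np.mean(accel_z)   # remove gravity/DC offset
    spectrum = np.abs(np.fft.rfft(x))            # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)  # matching frequency bins
    band = (freqs >= 0.1) & (freqs <= 0.5)       # 6-30 breaths per minute
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0
```

A real system would have to separate breathing from walking, head turns and heartbeat, which is presumably where the gyroscope and camera come in.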


It's still at a rudimentary stage — it doesn't do anything more than record data at the moment. But a future software update could presumably give you some kind of warning sign, play some soothing jazz, or tell you to calm down in Meryl Streep's voice (I swear that does it for me).

To be fair, you don't necessarily need Google Glass to do this. As Engadget points out, any device with the right sensors would work just fine. So just build it into the iWatch, will you, Apple? [Engadget]

This Google Glass App That Measures Human Emotions Is So, So Creepy

It's not like we need any more reminders about how creepy Google Glass can be, but developers never stop surprising us. A new app from Germany's Fraunhofer Institute, which combines facial tracking, proprietary tools and Glass, can measure human emotions. In real time.

The technology, dubbed SHORE (Sophisticated High-speed Object Recognition Engine), gauges emotions such as anger, happiness, sadness and surprise and projects this information directly onto the screen of your Glass, right across the face of the person you're looking at. It doesn't stop there. It also estimates their age and gender, a feature that, Fraunhofer says, could lead to applications in interactive gaming and market research. This is like RoboCop, but real, and on your face. Now.


If there are multiple people in a frame, you will get separate emotional attributes for each of them. All processing happens directly on the Glass CPU, which means your Glass device is probably going to last you all of 20 minutes.

The researchers at Fraunhofer "trained" SHORE by exposing it to a database of more than 10,000 annotated human faces. They say that beyond the creepy factor, the app could help people with disabilities such as autism: emotions the wearer is unable to interpret could be superimposed on their field of vision. Which, I guess, makes it less creepy? Maybe? [via GizMag]