Sure, gesture-controlled robots are cool. Even cooler? Scientists have developed a way to control a robotic arm just by blinking at it—no hands needed.
Researchers at Imperial College London demonstrated this by having a person control a robot arm to paint a picture using only her eye movements, all while she ate a croissant and drank a coffee.
Sabine Dziemian, a postgraduate student who took part in the demonstration, describes controlling the arm as an “intuitive and easy” experience.
To indicate which color she wants to use, she directs her eyes and blinks. The robot dips its brush and paints strokes as she guides it with her gaze. To change colors, she blinks three times while looking at a different color. Another three blinks returns the brush to the canvas.
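The blink protocol described above amounts to a small state machine. Here is a toy sketch of that logic in Python; this is not the Imperial team's code, and the palette names, class, and method names are all hypothetical illustrations:

```python
# Hypothetical sketch of the blink-selection logic: three blinks while
# gazing at a palette color selects it; three blinks while gazing
# elsewhere returns the brush to the canvas. Not the researchers' code.

PALETTE = {"red", "blue", "yellow"}  # assumed palette targets


class BlinkPainter:
    def __init__(self):
        self.color = None
        self.mode = "canvas"   # "canvas" = painting, "palette" = choosing
        self._blinks = 0       # consecutive blinks at the current target

    def on_blink(self, gaze_target):
        """Count consecutive blinks; act on every third one."""
        self._blinks += 1
        if self._blinks < 3:
            return self.mode
        self._blinks = 0
        if gaze_target in PALETTE:
            self.color = gaze_target   # dip the brush in this color
            self.mode = "palette"
        else:
            self.mode = "canvas"       # back to painting strokes
        return self.mode

    def on_gaze_move(self):
        # A large gaze shift breaks the blink sequence, so accidental
        # blinks while looking around do not trigger a color change.
        self._blinks = 0
```

For example, two blinks at "blue" do nothing, the third selects blue, and three more blinks anywhere else sends the brush back to the canvas.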
The final painting looks like something a five-year-old would create. However, this rudimentary experiment is just the start.
When we want to pick up a glass, we usually look at it to gauge its size and distance, and to plan how we will grip it. Robots don’t have that same visual sense of their surroundings.
Dr. Aldo Faisal, one of the researchers, says the team has spent six years developing systems that decode our intended actions from our eye movements, building algorithms that translate those movements into robot actions.
Faisal says the next step is to control the whole body of the robot with eye movements so that everyone can multi-task.
“Imagine, for example, that you can paint and eat and drink at the same time, imagine holding a baby and preparing its food while you do it all simultaneously. So there are whole new ways we can think about interacting with the world,” Faisal told Reuters.
This isn’t the only robotics project working on eye control. Earlier this summer, programmer Gal Sont demonstrated how to move a telepresence robot and make it talk by only using eye movements.
Sont suffers from ALS. The software lets him go outside and converse comfortably with people via the robot (something not every ALS patient is able to do), giving him back some of the freedom the disease took away.
This tech could be especially handy for people who suffer from debilitating conditions like multiple sclerosis or Parkinson’s disease, acting as an extra pair of arms and hands.
Let’s just hope robots don’t learn to decode what we’re thinking by our eye movements.
Image via Imperial College London/YouTube