The touchscreens used on smartphones, tablets, and wearables are leaps and bounds better than similar technologies from a few decades ago, but there’s still lots of room for improvement, as researchers have demonstrated by leveraging a bump-covered screen protector to dramatically reduce a modern touchscreen’s latency.
As responsive as a touchscreen may seem, there’s actually a slight delay between when you tap a screen and when a device’s user interface reacts to the gesture and updates the on-screen graphics. It’s almost completely imperceptible when you’re just performing quick taps, but it becomes more obvious with longer gestures like swipes. Grab an icon on your smartphone’s home screen and quickly drag it around, and you’ll see its position on screen trail behind where the tip of your finger actually is. The delay is especially noticeable when drawing or writing on a touchscreen with a stylus, which is why few devices come close to recreating the pen-on-paper experience, where strokes appear instantaneously.
Touchscreen latency will improve over time (most smartphones exhibit a latency of around 80 milliseconds today), but researchers from Carnegie Mellon University’s Future Interfaces Group have come up with a clever shortcut for improving latency right now without actually modifying a touchscreen’s hardware.
It starts with the addition of a plastic film, covered in a grid of small bumps measuring just five microns high, placed over a touchscreen. The bumps are invisible to the human eye and barely perceptible as a finger moves across the applied film. But much like scratching a fingernail across one of those lenticular images that appear to move as you change your viewing angle, swiping a fingertip or stylus over the pattern of bumps produces a subtle acoustic vibration that often falls in the ultrasonic range, outside of human hearing.
A user’s ears can’t hear the vibrations produced, but a microphone can, and the researchers are able to capture the sounds and determine the speed and direction of a touchscreen interaction with a latency of around just 28 milliseconds. What they can’t determine from the vibrations is where a finger or stylus actually makes contact with a touchscreen, so they feed that acoustic data, along with the more accurate positional data captured by the capacitive touchscreen itself, into a machine learning model that predicts where a fingertip or stylus is moving, and update the device’s user interface to reflect that prediction.
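The intuition behind the fusion can be shown with a much simpler stand-in for the researchers' machine learning model: dead-reckoning from the last accurate-but-stale capacitive sample using the fresher acoustic speed-and-direction estimate. Every name and number below is an illustrative assumption, not the actual system:

```python
# Simplified sketch of the fusion idea, not the researchers' actual model:
# combine the touchscreen's accurate but laggy position with the
# microphone's fast speed-and-direction estimate to extrapolate where the
# fingertip is right now. All names and values are illustrative.

from dataclasses import dataclass

@dataclass
class AcousticEstimate:
    speed: float                    # motion magnitude in pixels/ms, from vibration frequency
    direction: tuple[float, float]  # unit vector of motion, from the acoustic signature

def predict_position(last_touch_xy: tuple[float, float],
                     touch_age_ms: float,
                     acoustic: AcousticEstimate) -> tuple[float, float]:
    """Dead-reckon the current fingertip position.

    last_touch_xy: most recent capacitive sample (x, y), in pixels
    touch_age_ms:  how stale that sample is (the touchscreen's latency)
    """
    dx, dy = acoustic.direction
    travel = acoustic.speed * touch_age_ms
    return (last_touch_xy[0] + dx * travel,
            last_touch_xy[1] + dy * travel)

# A finger moving right at 0.5 px/ms, with a capacitive sample 28 ms stale:
predicted = predict_position((100.0, 200.0), 28.0,
                             AcousticEstimate(speed=0.5, direction=(1.0, 0.0)))
# predicted is 14 pixels ahead of the stale sample, at (114.0, 200.0)
```

A straight-line extrapolation like this drifts badly whenever the finger changes direction, which is presumably part of why the researchers use a learned model rather than simple kinematics.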
This method isn’t always 100% accurate, but it’s accurate enough to make touchscreen devices feel far more responsive, with perceived latency reduced to just 16 milliseconds and distance errors reduced to around five millimeters. As far as a user is concerned, the device they’re using suddenly feels snappier, and while moving icons around isn’t the most exciting application, this research could vastly improve the simulated pen-on-paper experience, allowing artists and devoted note-takers to use their devices more naturally.