It starts with a plastic film applied over a touchscreen, covered in a grid of small bumps just five microns high. The bumps are invisible to the human eye and barely perceptible as a finger moves across the film, but much like scratching a fingernail across one of those lenticular images that appears to move as you change your viewing angle, swiping a fingertip or stylus over the pattern of bumps produces a subtle acoustic vibration, often in the ultrasonic range beyond human hearing.

[Image: A Bump-Covered Screen Protector Can Surprisingly Make Touchscreens React Faster to Swipes]

A user’s ears can’t hear the vibrations, but a microphone can: the researchers capture the sounds and determine the speed and direction of a touchscreen interaction with a latency of just around 28 milliseconds. What the vibrations can’t reveal is where a finger or stylus actually makes contact with the screen, so the acoustic data is combined with the more accurate positional data captured by the capacitive touchscreen itself to feed a machine learning model that predicts where a fingertip or stylus is headed, allowing the device’s user interface to update to reflect that prediction.
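The researchers use a machine learning model for the actual prediction, but the underlying idea of fusing a slow-but-accurate position source with a fast velocity estimate can be sketched as simple dead-reckoning extrapolation. The sketch below is purely illustrative; the `TouchSample` type, units, and update rates are assumptions, not details from the research.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    t: float  # timestamp in seconds, from the capacitive touchscreen
    x: float  # x position in mm
    y: float  # y position in mm

def predict_position(last: TouchSample, vx: float, vy: float,
                     now: float) -> tuple[float, float]:
    """Extrapolate the last capacitive position forward using the
    acoustically estimated swipe velocity (vx, vy in mm/s).

    The capacitive sample is accurate but arrives late; the acoustic
    velocity estimate is fast but position-blind. Combining them lets
    the UI render where the finger likely is *now*.
    """
    dt = now - last.t
    return (last.x + vx * dt, last.y + vy * dt)

# Example: last confirmed touch at (10, 20) mm, 30 ms ago,
# with the film's acoustics indicating a rightward swipe at 100 mm/s.
sample = TouchSample(t=0.000, x=10.0, y=20.0)
x, y = predict_position(sample, vx=100.0, vy=0.0, now=0.030)
print(round(x, 1), round(y, 1))  # 13.0 20.0
```

A real system would also need to bound the extrapolation and fall back to the raw capacitive data when the prediction drifts, which is presumably part of what the learned model handles.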

This method isn’t always 100% accurate, but it’s accurate enough to make touchscreen devices feel far more responsive, with perceived latency reduced to just 16 milliseconds and positional errors reduced to around five millimeters. As far as the user is concerned, the device suddenly feels snappier, and while dragging icons around isn’t the most exciting application, this research could vastly improve the simulated pen-on-paper experience, letting artists and devoted note-takers use their devices more naturally.