Although there’s been a push to make video game controllers more adaptable and accessible for players who can’t operate a traditional gamepad as designed, Google is working to take physical controllers out of the equation entirely. Its Project Gameface allows games to be played using nothing but head movements and facial gestures.
One of the announcements at yesterday’s Google I/O 2023 that wasn’t part of the main stage demos was the cleverly named Project Gameface, which, as the name implies, uses a gamer’s face as a control mechanism. While products like Microsoft’s Xbox Adaptive Controller completely reimagine the gamepad, with uniquely shaped, oversized buttons and controls designed to be used with more than just fingers by those with limited mobility, Project Gameface is instead designed for gamers like Lance Carr, a streamer living with muscular dystrophy: a disease that progressively weakens and breaks down skeletal muscles over time.
Head-tracking mice are not a new idea, but they mostly rely on large head movements or eye blinks. That’s what Carr relied on until he lost all of his gaming gear in a house fire: a tragedy with at least one silver lining, as engineers at Google stepped in not only to replace his head-tracking mouse, but to greatly improve on it.
Project Gameface is the result: an open source hands-free gaming mouse that relies on an off-the-shelf webcam pointed at the user’s face. A set of machine learning models tracks 468 unique points on the face, allowing Gameface to accurately detect not only head movements but deliberate facial gestures, which can then be translated into mouse clicks or other shortcuts. To be as accommodating as possible, Project Gameface even lets the size of a gesture be adjusted, so those only able to make subtle facial movements can take advantage of it as well.
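To make the idea concrete, here is a minimal sketch of how tracked face landmarks could drive a cursor and a click. This is not the project’s actual code: the function names, the dead-zone filtering, and the mouth-openness “click” gesture are all illustrative assumptions; face-mesh models typically report landmarks as normalized 0–1 coordinates, and a sensitivity multiplier stands in for Gameface’s adjustable gesture size.

```python
# Hypothetical sketch of landmark-driven cursor control, in the spirit of
# Project Gameface. Names and thresholds are illustrative assumptions.

def landmarks_to_cursor_delta(prev, now, sensitivity=1.0,
                              screen_size=(1920, 1080), dead_zone=0.002):
    """Convert the frame-to-frame shift of one tracked facial landmark
    (normalized 0..1 coordinates) into a cursor delta in pixels.
    `sensitivity` mimics an adjustable gesture size: a higher value lets
    subtle head movements move the cursor further."""
    dx = now[0] - prev[0]
    dy = now[1] - prev[1]
    # Suppress tracking jitter smaller than the dead zone.
    if abs(dx) < dead_zone:
        dx = 0.0
    if abs(dy) < dead_zone:
        dy = 0.0
    return (dx * screen_size[0] * sensitivity,
            dy * screen_size[1] * sensitivity)

def gesture_triggered(mouth_open_ratio, threshold=0.3):
    """Treat a deliberate facial gesture (here, mouth openness relative
    to face height) as a 'mouse click' once it crosses a configurable
    threshold; lowering the threshold accommodates subtler movements."""
    return mouth_open_ratio >= threshold
```

In a real pipeline, the landmark coordinates would come from a per-frame face-mesh inference on the webcam feed, and the resulting deltas would be fed to the operating system’s virtual mouse.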
Google says Project Gameface is still in development and not quite ready for primetime, but has made it available on GitHub for those who want to try it out or contribute to the ongoing work.