Apple’s first-ever pair of smart glasses is most likely coming at some point this year, but what they’ll actually look like is still a pretty open question. Cameras? Most likely. Open-ear audio? That’s a no-brainer. AI? It’s 2026, after all. If there’s one thing I’m pretty certain they won’t have, it’s support for first-of-their-kind camera-based hand gestures, though a new and somewhat dubious rumor suggests otherwise.
According to MacRumors, an “inside source” claims that Apple will use one of the glasses’ two on-device cameras for a kind of hand-tracking that can read gestures and translate them into inputs. While hand-tracking already exists on devices like the Meta Quest 3 and Vision Pro, putting that technology in a form factor as small as smart glasses would be novel. Because of that boundary-pushing nature, the rumor is attracting skeptics like Bloomberg’s Mark Gurman, who has already published his own reports on what to expect from Apple’s glasses.
As Gurman notes, there are a few issues with putting hand gestures in a pair of smart glasses, but two of the biggest are the processing power it would take and the strain on battery life. Smart glasses, as I mentioned, are small, so power and battery are already two of the biggest bottlenecks. As tempting as it would be to port features from larger XR devices like the Vision Pro over to a pair of smart specs, in practice, we’re still a ways off from being able to do that.

Hand gestures (even simplified ones) qualify as Vision Pro-like features in this case, and if that input method comes at the cost of battery, you can bet on Apple looking elsewhere for its control mechanics.
And even if Apple could make hand gestures work without sacrificing battery life, there’s probably little incentive to do so right now. If Apple’s first pair of smart glasses is sans display, as early reporting would suggest, there’s no need for a sophisticated input system like the Neural Band that Meta uses on its Meta Ray-Ban Display. Trust me, I’ve used quite a few pairs of non-display smart glasses at this point, and touch controls work just fine.
So, as tantalizing as the idea of a first-of-its-kind feature is, I’m going to side with Gurman on this one and file that rumor under “probably not.” And that’s fine. With the right look and iPhone compatibility, Apple’s first pair of smart glasses might not need anything novel to stand out.