
Meta Smart Glasses Can Now Track All the Food You Put Into Your Mouth

Whether people want to give Meta all their food data is another question entirely.

A new, more prescription-focused style of Ray-Ban-branded smart glasses stole the attention this week, but Meta also quietly announced a few more features for its smart glasses lineup, including… a way to track everything you eat.

According to Meta, owners of its Ray-Ban AI glasses or the Meta Ray-Ban Display will soon be able to snap a photo of what they’re eating with a voice prompt and then log that food item in the Meta AI app. Meta says it will use AI to “extract key nutrition details” from those photos. The idea is to pair your food pics with AI to give you “personalized insights” and help you make “healthier, more informed choices.”

That process might involve asking Meta AI things like “What should I eat to increase my energy?” or other prompts in that vein. One thing that jumped out to me in Meta’s explanation of that feature, though, is that the company has lofty plans to expand that functionality in the future.

Obviously, having to manually log everything is a bit of a pain, and having smart glasses that do the same thing, but in an “ambient” kind of way, would be more convenient. That’s why Meta says that “in the future,” its smart glasses will “understand what you’re eating and automatically log your food.” Sounds great, if you’re into that sort of thing, but there are some pretty major problems with that idea.

For one, I’m pretty sure Meta’s smart glasses would have to be recording constantly for that to work, and given the way things are going on the privacy front, I don’t think people will be very receptive to smart glasses that record everything all the time. On top of that, keeping the camera engaged around the clock is a one-way ticket to a woefully short battery life. So, I don’t know… it sounds like a good idea in theory, but I’m going to file it under “probably not” for now. That’s not even counting the fact that people might be a little more hesitant to hand their data over to Meta right now, even if it’s just the sad sandwich they panic-ate for lunch.

Meta says nutrition tracking will be available on its non-display AI glasses soon, and on the Meta Ray-Ban Display this summer.

Meta also announced hands-free WhatsApp summaries, which will be available in the early-access program “soon,” as well as display recording, which lets you capture what the screen inside the Meta Ray-Ban Display looks like and is also coming “soon.” As for features you can use right now: Meta announced the ability to scroll Instagram Reels on the Meta Ray-Ban Display, “glanceable widgets” that show reminders, weather, stocks, and the calendar on the Meta Ray-Ban Display home screen, and a new Spotify shortcut. Neural handwriting, which uses the Meta Ray-Ban Display’s Neural Band to let you write using just your fingers, is also set to launch “in the coming weeks.”

Ultimately, there’s nothing groundbreaking here, but as is the way of smart glasses right now, it’s a mix of stuff you’d think the devices would already have and other stuff that feels like it’s a privacy nightmare waiting to happen.
