Apple Reportedly Wants a 3D Sensor on the Back of iPhones by 2019

Image: Alex Cranz/Gizmodo

After the relative success of Apple’s front-mounted 3D sensor on the iPhone X, which is the component that enables Face ID and the company’s playful Animojis, Apple is reportedly aiming to complement 2019 iPhones with similar tech on their backsides, too.

However, according to Bloomberg, the rear-facing 3D sensor would work a bit differently from the 3D tech found in the iPhone X's notch. Instead of projecting up to 30,000 infrared dots to gauge distance, the rear system would reportedly fire out laser pulses and calculate distance from how long the light takes to bounce off an object and return to the sensor, a technique known as time of flight. By firing out enough lasers, it would be possible to accurately map a room or detect various objects in three dimensions.
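The math behind that time-of-flight approach is simple: light travels at a known, fixed speed, so the round-trip time of a pulse tells you how far away the object is. Here's a minimal sketch of the idea (the function name and sample timing are illustrative, not anything from Apple's hardware):

```python
# Time-of-flight ranging: distance from the round-trip time of a light pulse.

C = 299_792_458.0  # speed of light in a vacuum, meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Estimate distance to an object from a pulse's round-trip time.

    The pulse travels out AND back, so the one-way distance is half
    the total path the light covered.
    """
    return C * round_trip_seconds / 2.0

# A pulse returning after roughly 6.67 nanoseconds bounced off
# something about one meter away.
print(tof_distance(6.67e-9))  # ~1.0 m
```

The nanosecond-scale timing is why this kind of sensor is harder to build than a dot projector: resolving distances down to centimeters means measuring time differences of well under a billionth of a second.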

This combo would provide Apple’s iPhones with pretty beefy 3D tech on both the front and back of the phone, and could help Apple’s attempts to make augmented reality go mainstream.


Apple only recently released a separate AR platform in the form of ARKit on iOS 11. Since ARKit only requires the iPhone’s camera to function, it seems like more sophisticated 3D sensors could fragment AR app development. I admit, it’s possible that Apple already has a solution in mind to keep these similar, yet different technologies from competing with each other. But if not, Apple could find itself in the same position that Google’s Tango AR platform is currently in: great tech, but a dire lack of content.

Or maybe Apple is just trying to make sure you can turn any face into an Animoji—not just the ones the front-facing sensor can see. With rumors already rumbling about Apple’s trio of iPhones coming next year, the company should have plenty of time to work out any major issues. Now we’ll just have to sit back and wait. It’s going to be a while.

Senior reporter at Gizmodo, formerly Tom's Guide and Laptop Mag. Was an archery instructor and a penguin trainer before that.


