If you’re paying hundreds of dollars extra for a Pro or Pro Max version of the iPhone 12, then you want to make sure you’re getting your money’s worth—and taking advantage of the extra photo-taking capabilities is a key part of that. Here are the features you get, and how to get the most out of them.
The iPhone 12 Pro and the iPhone 12 Pro Max have three rear cameras compared to the two on the iPhone 12 and the iPhone 12 Mini. The extra lens is a telephoto one, which means you get 2x optical zoom on the Pro and 2.5x on the Pro Max (for a 4x and 5x total optical range, respectively).
It’s worth noting that the Pro Max has a slightly better setup than the Pro, beyond the longer optical zoom: Its main 12MP sensor is larger, so it captures more light and handles low-light photography better, and the sensor itself shifts position to counteract camera shake (sensor-shift stabilization), rather than relying on the lens-based stabilization used in the other iPhone 12 models.
There’s also a LiDAR scanner on board, a technology we’ve gone into more detail about here. Essentially, it’s an upgrade for your iPhone’s depth-sensing skills, so augmented reality apps are more precise and camera autofocus can pick out a point in space and fix on it faster (especially in low light).
Most of the time, the benefits that you get from the iPhone 12 Pro and iPhone 12 Pro Max cameras are applied automatically in the background: there are no settings buttons to toggle or features to enable. However, it’s still worth knowing what the key advantages are, and the situations where you’ll see the biggest improvements.
You’ll notice an extra zoom option when you load up the Camera app on the iPhone 12 Pro and Pro Max models, compared to the other iPhone 12 variations. Instead of just the 0.5x and 1x options, you’ll see a 2x option on the iPhone 12 Pro and a 2.5x option on the iPhone 12 Pro Max.
The benefits of the extra optical zoom are obvious: Even if you can’t physically get closer to whatever it is you’re shooting, whether it’s the stage at a music gig or a bird in your backyard, the iPhone camera can help. The built-in image stabilization on the iPhone 12 Pro Max should prove useful at higher zoom levels too.
It’s worth noting that the longer telephoto lens on the Pro Max collects less light. If you’re using the 2.5x zoom in low light, the native Apple Camera app will switch to cropping a photo shot with the main lens, keeping noise levels low while maintaining the zoom effect. If you want to keep using the 2.5x telephoto lens anyway, you’ll need a third-party app that gives you full control (like Halide, whose developers have explained the camera switch).
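For the curious, this kind of lens control is exposed to developers through Apple’s AVFoundation framework: an app can request the rear telephoto camera specifically instead of letting iOS pick a lens. Here’s a minimal sketch (error handling and full session setup are simplified, and this isn’t necessarily how Halide does it):

```swift
import AVFoundation

// A minimal sketch: request the rear telephoto module directly,
// rather than letting iOS swap lenses behind the scenes.
func makeTelephotoSession() -> AVCaptureSession? {
    // Returns nil on devices without a telephoto camera.
    guard let telephoto = AVCaptureDevice.default(.builtInTelephotoCamera,
                                                  for: .video,
                                                  position: .back),
          let input = try? AVCaptureDeviceInput(device: telephoto) else {
        return nil
    }
    let session = AVCaptureSession()
    session.beginConfiguration()
    if session.canAddInput(input) {
        session.addInput(input)
    }
    session.commitConfiguration()
    return session
}
```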
The usual rules for shooting while zoomed in apply: Keep the camera as steady as you can, using a tripod or some other support where available. The faster shutter speeds and anti-shake technologies in the iPhone 12 Pro and iPhone 12 Pro Max should help here, but you can also take steps to ensure you get a great shot each time.
Every iPhone 12 model is very capable when it comes to shooting in low light, but with its larger sensor, the iPhone 12 Pro Max is going to get you the best results. There’s no dedicated night mode button in the iOS Camera app; the mode turns itself on automatically when you’re shooting in low-light conditions.
You’ll see a yellow icon appear on the shutter screen with night mode activated, and Apple’s software will choose an exposure time based on the scene you’re looking at (this will be shown on the night mode icon itself). The longer the exposure time, the more light can get into the shot, but the longer you’ll need to keep the camera steady (again, the Pro Max’s self-correcting sensor can make a difference here).
If you want to let in as much light as possible and are confident in your ability to keep the phone still—or you have a tripod—then tap on the yellow night mode icon and you’re able to adjust the exposure time manually, up to a maximum of three seconds. In some cases, it may be worth overruling the Camera app to capture more detail.
Switch to the Portrait mode using the selector next to the shutter button and you can get some really well-judged background blur effects thanks to the advanced depth-sensing that LiDAR does: It’s better able to identify the edge between subjects and background through laser scanning, as demonstrated by Ben Lovejoy at 9to5Mac.
We’ve already mentioned how LiDAR can help in night photography and portraits, and its superior depth sensing is one of the main ways it’s going to improve your photos on the Pro cameras without you actually having to do anything—it’s faster and more accurate than the ToF (time-of-flight) sensors included on a lot of other handsets.
You’ll see that in improved Portrait shots and faster focusing in low light. However, the main benefit of adding LiDAR to a phone camera is not to improve the pictures you can get, but to add extra speed and accuracy to augmented reality applications. At the moment there aren’t a huge number of AR apps you can install to test this out, but over time we’ll see more and more appear.
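If you’re wondering how those AR apps actually get at the LiDAR data, Apple surfaces it through the ARKit framework: apps can check whether the device supports per-frame scene depth before opting in. A minimal sketch might look like this (LiDAR-equipped devices such as the iPhone 12 Pro and Pro Max are the ones that pass the check):

```swift
import ARKit

// A minimal sketch: opt into LiDAR-backed depth data only when the
// device supports it. The .sceneDepth frame semantic is available
// exclusively on LiDAR-equipped iPhones and iPads.
func configureTracking() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        // Request a per-frame depth map from the LiDAR scanner.
        config.frameSemantics.insert(.sceneDepth)
    }
    return config
}
```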
The Measure app comes with your iPhone and offers a quick and easy way of testing out AR, though the differences in speed and accuracy might not be immediately apparent. Point the Measure app at a person—making sure their full body, head to toe, is visible in the camera viewfinder—and after a second or two you should see the app give you an estimate of their height, thanks to the integrated LiDAR.
Apps such as Scandy Pro 3D Scanner, RoomScan LiDAR, and Snapchat can also give you an idea of how well the LiDAR scanning works on the iPhone 12 Pro and Pro Max—Snapchat’s developers have promised an AR filter that specifically takes advantage of the LiDAR on Apple’s most expensive 2020 iPhones, though we’re still waiting for it to roll out.
The arrival of the Apple ProRAW mode is a big upgrade for iPhone 12 Pro and Pro Max users, offering more control for serious photographers. RAW-format photos have been available on phones for years now; they minimize any processing or effects added by the software, giving users access to the ‘raw’ data captured by the image sensors. With ProRAW, Apple is trying to give us the best of both worlds.
That means it applies some iPhone processing magic to the picture—like Deep Fusion and Smart HDR—while still giving photographers as much flexibility as possible when it comes to adjusting white balance, tone, color, and so on in an image editor. “ProRAW gives you all the standard RAW information, along with the Apple image pipeline data,” explains Apple.
As we’re writing, ProRAW has just arrived in the iOS 14.3 beta, though it may have rolled out fully by the time you’re reading this. To enable it, open iOS Settings, head into Camera, then Formats, and turn on Apple ProRAW. Once that’s done, you’ll see a RAW button up at the corner of the screen when you’re taking snaps, which you can tap to toggle ProRAW on and off.
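Apple also exposes ProRAW to third-party camera apps through AVFoundation in iOS 14.3 and later. For developers curious how opting in works, here’s a minimal sketch (the capture session and delegate are omitted for brevity):

```swift
import AVFoundation

// A minimal sketch: enable Apple ProRAW on a photo output and build
// capture settings for it (iOS 14.3+, Pro/Pro Max hardware only).
func proRAWSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // ProRAW is only supported on capable devices.
    guard output.isAppleProRAWSupported else { return nil }
    output.isAppleProRAWEnabled = true
    // Pick the first ProRAW pixel format the output advertises.
    guard let rawFormat = output.availableRawPhotoPixelFormatTypes.first(
        where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) }) else {
        return nil
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}
```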
The feature has only just appeared, so we’re waiting to see exactly what difference it makes to photos and to workflows. You can actually capture images from an iPhone in a more conventional RAW format; you just need a third-party app to do it. VSCO, Darkroom, and Snapseed are among those with the necessary functionality.