So there have been these rumors of a larger iPhone - an 'iPhone Math,' as some are calling it. A worthless rumor, right? I rather hope not. As Marco has suggested, maybe this iPhone Plus is real, and the die-shrunk A5X is heading for this device. I'm going to assume that his wild stab in the dark about the device's existence is correct.
The notion of a 5-inch iPhone seems insane and unlikely, but hey, remember when we said the same thing about a 4-inch iPhone? The Tech Block's Jason Paul Richmond crunches some serious numbers. Pay attention. This could be your next iPhone.
To bolster Marco's point of where the iPhone Math fits into the lineup of the competition, I have to call upon my unscientific and anecdotal observations: My dad acquired an iPhone 4S early last summer. And he loves it. He also started texting for the first time. My dad, at the age of sixty, is texting. But there is a problem. He's finally having enough trouble with his eyes that he can't read texts on the 4S without zooming in, making texts hard to read in a totally different way. After being shown the Galaxy S III, his words were something to the tune of, "I'd love to have a larger screen."
I know, I know - one-handed operation is, as they say in common parlance, where it's at. But does anybody use the iPhone one-handed exclusively? And there is a case to be made that if you can't spare two hands, you probably shouldn't be using the phone anyway. So this argument, while compelling to those who are old pros at one-handed operation, falls flat in my casual observation of ordinary folks actually using iPhones, let alone these giant-screen competitors.
For a sizable portion of the population, the benefits of larger-screen phones seem to trump their failings - they're simply a better choice. Hence the popularity of enormous Android devices. Some of that popularity comes from a lack of choice - but I think that lack exists because other manufacturers are looking for an edge, having had trouble competing against Apple at Apple's sizes. A near five-inch iPhone could eviscerate the competition. Even a screen at a lower pixel density might do to the regular, old iPhone what the iPad Mini seems to be doing to the iPad - be "inferior" in crucial ways and still sell more.
But here's where Marco's blatant speculation - somewhat affirmed by both Gruber and Dalrymple - falls apart for me. Why would they offer a larger-screen phone with a 1136 x 640 resolution - by far the lowest resolution in its class, one-third the pixels of the 1080p displays now hitting the market - unless they were selling it for cheaper? Given the demand for large-screen smartphones, why would Apple want to sell it for cheaper? And why would they throw the A5X in there when it probably costs as much as the better A6 to fabricate, and the plain old A5 would do just as well?
A different kind of iPhone math has been used to define the Retina display, one involving both pixel density and typical viewing distance. But I have a problem with that math - namely, the implicit assumption that 300 ppi (pixels per inch) viewed from a foot away is some kind of magic number beyond which the human eye with "perfect" 20/20 vision cannot perceive individual pixels. 20/20 is not the bastion of clarity it's been made out to be. It just means that you can see from twenty feet what the typical person can see from twenty feet.
I know a lot of people need glasses, but the reasons for corrective lenses are varied. Some are farsighted, some are nearsighted, some have astigmatism, some have just gotten older. My brother needs them because he develops a lazy eye and headaches without them - he can actually see farther with them off. Whatever the reason, their vision has been corrected, hopefully at least to that 20/20 mark, if not better. So why would Apple settle on 264 ppi at 10 to 14 inches away, when the majority of people with normal vision, glasses or not, would be able to discern individual pixels? Not to mention the fair number of us who have better than 20/20 vision and aren't satisfied with 326 ppi screens.
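The acuity argument above can be sanity-checked with the standard back-of-the-envelope model (my sketch, not the article's): 20/20 vision is conventionally treated as resolving features one arcminute apart, so the finest resolvable pixel pitch grows with viewing distance.

```python
import math

# Sketch of the usual visual-acuity arithmetic behind "Retina" claims.
# Assumption: a 20/20 eye resolves features about one arcminute apart.
ARCMINUTE_RAD = math.radians(1 / 60)  # one arcminute, in radians

def ppi_limit(distance_inches: float) -> float:
    """Highest pixel density a 1-arcminute eye can still resolve at this distance."""
    pixel_pitch = distance_inches * math.tan(ARCMINUTE_RAD)  # inches per pixel
    return 1 / pixel_pitch

for d in (10, 12, 14):
    print(f"{d} in: ~{ppi_limit(d):.0f} ppi")
```

Under that model, the limit at 10 inches is roughly 344 ppi and at 12 inches roughly 286 ppi - so a 264 ppi panel held at the near end of that range would indeed show discernible pixels to a merely 20/20 eye, which is the point being made above.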
Boosting the pixel density at a given size means boosting the resolution, which poses problems. Some believe that Apple should just adopt the industry standard 1080p. I don't think that would be a bad idea per se, especially if the Apple TV could run apps, but it can make things even more complicated for developers - complications that have the potential of devaluing iOS's greatest asset, the quality of third-party software. Adding complexity to a system means time will have to be spent dealing with such complexity, which means less time to add polish to existing apps and less time to create new ones, possibly slowing the virtuous cycle of the App Store. Plus there are bandwidth costs and memory constraints associated with bloated universal binaries. Add to that the extra complexity in the supply chain - you start to see why Apple has been so conservative about screen sizes.
So if they were to take this leap, what route would be best for Apple? When they first did this with the iPhone 4, they doubled the resolution, quadrupling the number of pixels of all previous iPhones. [Note: Apple made it really convenient for developers to distinguish Retina graphics from non-Retina graphics by adding '@2x' to the name of the file (as in image@2x.png). So from here on out, I'll use @1x for the pre-Retina resolution, @2x for Retina, and @[number]x where that number is the multiple of the pre-Retina resolution - square it to get the number of pixels that fit per pre-Retina pixel.]
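The squaring rule in that note is easy to verify with quick arithmetic (function name is mine, just for illustration):

```python
# The note's squaring rule: an @n x scale factor means n * n physical
# pixels for every pre-Retina (@1x) pixel.
def pixels_per_1x_pixel(scale: float) -> float:
    return scale ** 2

assert pixels_per_1x_pixel(2) == 4   # @2x: the iPhone 4 quadrupled the pixels
assert pixels_per_1x_pixel(3) == 9   # @3x: nine pixels per @1x pixel

# The iPhone 4 jump in absolute terms: 480 x 320 -> 960 x 640.
assert 960 * 640 == (480 * 320) * pixels_per_1x_pixel(2)
```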
Doubling the resolution from @2x Retina to @4x for Retina+ seems like the easiest thing to do. A 2272 x 1280 resolution near five-inch display would be great, but I think it's a ways off. While the A5X and A6 could handle the load, would they be snappy enough? And what about the battery life and the heat dissipation? Then there is another problem with doubling the resolution, one which makes apps and websites that don't serve up hi-dpi images (like the following graphics) look terrible - a problem that merits a little examination.
So let's say you have a dot. It takes up one pixel. I'm going to assume that pixel is one found on an RGB stripe display, with three sub-pixels of red, then green, then blue. Our eyes are most sensitive to green light, which makes up about two-thirds of perceived white light. So with the bright green in the middle flanked by the not-quite-as-bright red and blue, it creates the illusion of a pinprick - a single, round dot of light. Double the resolution, and that dot now takes up four pixels. Four pinpricks of equal intensity where the lower resolution had only one:
Notice how that dot now looks more like a square? That's where the hyper-jagginess of low-resolution graphics that are simply pixel-doubled comes from. That's why surfing the web on a Retina MacBook Pro is so bad at the Best for Retina setting.
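A toy nearest-neighbor upscale (my own illustration) shows the effect in miniature: the single lit pixel in a tiny bitmap becomes a hard-edged block when pixel-doubled.

```python
# Pixel-doubling a bitmap by naive repetition: each pixel becomes a
# 2 x 2 block, so a one-pixel dot turns into a square.
def pixel_double(bitmap):
    doubled = []
    for row in bitmap:
        wide = [px for px in row for _ in (0, 1)]  # repeat each pixel horizontally
        doubled.append(wide)
        doubled.append(list(wide))                 # repeat each row vertically
    return doubled

dot = [[0, 0, 0],
       [0, 1, 0],
       [0, 0, 0]]

for row in pixel_double(dot):
    print(row)  # the lone dot is now a 2 x 2 square of equal-intensity pixels
```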
So what could Apple do to up the resolution without the jaggies and the potential hit to performance and battery life? Simple. If we count by squares, the next step isn't sixteen; it's nine. @3x comes before @4x. With nine pixels in place of four, we have only 125% more pixels to drive instead of the 300% more of the @4x scenario:
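Those percentages check out with a one-liner (helper name is mine):

```python
# Extra pixels to drive, relative to today's @2x baseline, as a percentage.
def extra_pixels_vs_2x(scale: float) -> float:
    return ((scale / 2) ** 2 - 1) * 100

assert extra_pixels_vs_2x(3) == 125.0   # @3x: nine pixels where @2x has four
assert extra_pixels_vs_2x(4) == 300.0   # @4x: sixteen pixels where @2x has four
```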
But - you interject - this yields the same problem as the intermediate @1.5x resolution Apple passed on before going to double the resolution. See, @2x to @3x runs into the same problem as @1x to @1.5x:
Yes, @2x raster graphics would get a little fuzzy with the anti-aliasing, but any softness in @2x graphics would already be so small as to be hardly noticeable, while the things that really need to be sharp - text and photos/videos shot on the device - would improve greatly. And that's not the reason Apple didn't muck about with that middling @1.5x (which would have been 720 x 480). The gymnastics required to support a 50% higher resolution were simply too onerous, for reasons that go deeper than the mere fuzziness of anti-aliasing - it was not a whole multiple of the original resolution.
When Apple doubled the resolution the first time, they did something absolutely brilliant: they left the point sizes the same. In iOS, when you draw to the screen, you don't draw @2x - you draw in points, which equate to one pixel on pre-Retina displays and four pixels on the current displays. On the iPhone 4, your target is the same as on the original iPhone: 480 x 320. On the iPhone 5, that target is 568 x 320. That's exactly what the target would be for this hypothetical 1704 x 960 screen. All text and other vector objects would be drawn @3x automatically. All @2x raster objects could just be scaled by 1.5 and anti-aliased - and would still look great. I don't think @3x assets would even be necessary, or desirable, given the amount of work required to generate them and the valuable space the files would consume.
The @3x advantage doesn't end at displaying @2x assets better than @2x displays show @1x assets. With some tweaking of the scaling algorithm, you could fix the hyper-jagginess of low-resolution graphics by blending the corner pixels of a single point with the adjacent corner pixels. The @1x graphics would then be an improvement over the original boxiness, even if they look softer.
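Here's a toy one-dimensional sketch of that blending idea - my own illustration of the principle, not Apple's actual scaler: triple each source pixel, then soften the seams by averaging the two output pixels that straddle each boundary between differing source values.

```python
# Naive tripling: hard, boxy edges (the jaggy case).
def triple_nearest(row):
    return [v for v in row for _ in range(3)]

# Tripling with seam blending: the pixels on either side of each source
# boundary are replaced by the average of the two neighboring values,
# trading a little softness for much less jagginess.
def triple_blended(row):
    out = triple_nearest(row)
    for i in range(len(row) - 1):
        mixed = (row[i] + row[i + 1]) / 2
        edge = 3 * i + 2          # last pixel of source i, first of source i+1
        out[edge] = mixed
        out[edge + 1] = mixed
    return out

print(triple_nearest([0, 1, 0]))  # abrupt 0 -> 1 -> 0 transitions
print(triple_blended([0, 1, 0]))  # 0.5 ramps at each edge instead
```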
The pros of @3x near five-inch iPhone:
- Less jagginess, possibly even for original resolution graphics
- Devs don't have to do a thing, not even upgrade their @2x assets
- Websites don't have to offer up @3x graphics
- Binaries don't balloon
- Small text is far more readable
- Pictures and HD movies look awesome(r)
- Touch targets are larger and therefore easier to hit, especially if you have large hands
- It approaches a genuine retina display (4.94" at 396ppi, 4.8" at 407ppi, 4.5" at 435ppi, the current 4" would be at 489ppi)
- More room for the battery and antennas
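The densities in that list can be checked with the usual diagonal formula, assuming the 1704 x 960 panel discussed earlier: ppi is the diagonal in pixels divided by the diagonal in inches.

```python
import math

# Verifying the listed pixel densities for a 1704 x 960 panel:
# ppi = diagonal-in-pixels / diagonal-in-inches.
W, H = 1704, 960
diag_px = math.hypot(W, H)  # ~1955.8 pixels corner to corner

for inches in (4.94, 4.8, 4.5, 4.0):
    print(f'{inches}": {diag_px / inches:.0f} ppi')
```

The results land on 396, 407, 435, and 489 ppi - matching the figures above.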
The cons of @3x near five-inch iPhone:
- Worse one-handed operation, unless you have large hands
- You may have to buy not-as-skinny skinny jeans or carry a handbag
- Another device in the supply chain vying for parts
- Battery life might suffer, but probably not
- Apple might have to go against what they said previously (gasp!)
- Apple would be perceived as "reactionary"
Re that last con: if making the best product on the market is reactionary, then yes.
One last note: A few years ago it was the iPad and Apple TV in the spring, the iPhone in the summer, and the iPod lineup in the fall, with Macs dispersed randomly throughout the year. Now Apple has gotten everything onto the Christmas cycle - which means that every device they make will be competing with every other device they make for parts and for scale. The end of 2012 was just crazy. As impressive as it was, I would prefer Apple didn't revamp every single product just before the winter.
What would make a great amount of sense is if Apple were gearing up to do a springtime release of the 9.7″ iPad and this 4.x″ iPhone+ to go head to head with the Galaxy S and HTC One lines. Use winter and summer primarily for the release of new Macs and OS X, spring and fall for the release of iOS and devices. Two sizes of iPads and iPhones released in a big-little cadence, each iterating and improving on the other. That sounds like a great way to roll…
The TechBlock carries select tech-related content that's produced in-house or hand-picked from user submissions that meet our criteria. To publish with us or to learn more about the publishing process, visit our publisher page.