I am not on board with 8K. The TVs will be expensive, there’s zero content for them, and they’ll rely heavily on internal upscaling processors that already struggle to properly upscale HD content to 4K. It seems smarter to work on HDR tech, which delivers a more substantial improvement right now than higher resolution does. 8K feels less like new tech to be excited about and more like flashy language someone in marketing hopes will help a company sell a few more TVs. But during a closed-door briefing at CES last week, Sony attempted to make a case for why 8K should be the future of televisions, and it made some sense.
For Sony, there are two issues at play here. First, there’s the imperative to make content look as close as possible to how it looked when it was edited and graded on a reference monitor. Second, there’s the need for the content to look “real.”
Film and TV editors and even the workers handling live broadcasts for sports and award shows use reference monitors, which are much smaller than our TVs but much higher in quality. They display more color, have none of the irritating issues our TVs do, like banding, pixelation, or halos, and they’re sharp as hell.
I’ve had the opportunity to look at content on reference monitors versus a wide variety of TVs on multiple occasions, and the experience is a shock. The picture is so good it makes me conscious of how bad every TV I’ve used before is by comparison.
Sony thinks TV can get as good as a monitor by upping the number of pixels, and thus the pixel density. A 30-inch OLED reference monitor from Sony has a pixel density of approximately 156 pixels per inch. (This and all other pixels-per-inch numbers in this piece are approximate and calculated using this web app.) You have to get up close to see those pixels, but most of us will sit further back and just see a nice sharp image with dreamily rendered colors. It’s not as pixel-dense as an iPhone XS (458 PPI) or the MacBook Pro (227 PPI), but it’s better than a Sony 65-inch 4K television, which has just 71.79 pixels per inch. A great Sony TV has less than half the pixel density of that monitor.
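All of those pixels-per-inch figures fall out of the same simple geometry: the pixel count along the screen’s diagonal divided by the diagonal size in inches. Here’s a quick sketch of that math; the article’s exact figures came from a web app that may use slightly different panel dimensions, so treat small discrepancies as rounding.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: pixels along the diagonal divided by the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 65-inch UHD (3840 x 2160) panel:
print(round(ppi(3840, 2160, 65), 2))   # roughly 68 PPI

# An 85-inch 8K (7680 x 4320) panel:
print(round(ppi(7680, 4320, 85), 2))   # roughly 104 PPI
```

Doubling the resolution in each dimension while keeping the screen size fixed doubles the PPI, which is why a smaller 8K set closes the gap with the reference monitor so quickly.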
And you do notice that lesser pixel density—especially in places where you’re meant to see a smooth gradient of color, such as sunrises, sunsets, and areas where light casts a long shadow. There’s often inescapable pixelation and banding. You can see the step up or down in color and light from one pixel to the next.
Sony thinks that can be solved by upping the resolution. Its smallest 8K set will still be a whopping 85 inches, but it will have a pixel density of 104.16 PPI. And if that 8K magic trickled down? A 65-inch set would be 136.61 PPI, and a 55-inch set would have a higher pixel density than the reference monitor. Theoretically, this could mean that these TVs look better than what’s on your TV stand.
The problem is that while Sony and other 8K TV makers may be rapidly approaching the pixel density of monitors, they are missing other features that contribute to the monitors’ gorgeous images. Like the brightness: a reference monitor like Sony’s BVM-X300 v2 has a peak brightness of 1,000 nits and, being an OLED, controls the brightness level pixel by pixel. OLED TVs struggle to hit 800 nits and do not have as fine control, which leaves dark scenes looking darker than they should. LED TVs can sometimes hit 1,000 nits but have far coarser control, which leads to haloing around bright objects in a dark scene.
This lack of granular, pixel-by-pixel control leaves TVs looking dimmer, duller, and just not as good as a reference monitor, and no amount of pixels can fix that. But Sony seems to think boosting pixels is a step in the right direction, not just because it’s closer to the pixel density of a reference monitor, but because it provides a more “real” image.
And this is where things get complex. Here we turn to a 2013 paper describing a study by a group of NHK engineers. The NHK is Japan’s public broadcasting organization—think PBS or the UK’s BBC—and it is heavily involved in creating broadcast standards, encoder types, and even the very first HDR standard for broadcasting, Hybrid Log-Gamma (created jointly with the BBC). NHK’s engineers posited that there is a distance at which an object on a TV, under optimal lighting, will appear as real to the human eye as the actual object. In the study, 82 observers were shown images from different sets through a contraption that normalized the picture for size and other factors, and were asked to compare each image to the real thing. The results suggested that the distance for content to appear “real” is very different from another, more commonly discussed distance: the distance at which the pixels on a TV can no longer be seen by the naked eye.
You’ve definitely heard of this concept—I cited it when complaining about 8K televisions last week. There is a limit to the visual acuity of your eyeballs. In practice, this means that you will not be able to perceive improvements in screen resolution beyond a certain distance. For example, if you sit any further than about 7 feet from a 65-inch 4K television, it starts to become indistinguishable from a 1080p set. The distance changes based on the size and resolution of the TV, which is why there’s another, less-discussed number to describe it: the optimal viewing distance in image heights. For a 1080p display, it’s 3.1 H, or the height of the display times 3.1. For a 4K display, it’s 1.5 H, and for an 8K display, it’s 0.75 H.
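As a rough sketch of how those image-height multiples translate into seating distance, here’s the conversion for a 16:9 panel. These are ballpark figures under a simplified acuity model, not numbers from the paper; the exact thresholds vary by source.

```python
import math

def viewing_distances_ft(diagonal_in: float) -> dict:
    """Convert the optimal-viewing-distance multiples (in image heights, H)
    into feet for a 16:9 display of the given diagonal size."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # panel height in inches
    multiples = {"1080p": 3.1, "4K": 1.5, "8K": 0.75}
    return {res: round(m * height_in / 12, 1) for res, m in multiples.items()}

print(viewing_distances_ft(65))  # {'1080p': 8.2, '4K': 4.0, '8K': 2.0}
```

For a 65-inch set, this simple model puts the 4K acuity limit around four feet—closer than the roughly-seven-feet figure above—which illustrates how much these thresholds depend on the acuity assumptions used.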
Despite this commonly accepted wisdom, the NHK paper’s research suggests that the distance for a more “real” image on an 8K display isn’t 0.75 H but 1.5 H—the same as the visual-acuity distance for a 4K set. If NHK is right, that means an 8K TV should feel more “real” than a 4K TV viewed from the same distance in the same room. (This ITU paper on the state of UHD TV has one of the easier-to-grasp explanations of “realness” in TV.)
I say “should” because I haven’t seen a perfect demonstration of this theory, as there are other factors besides resolution that affect picture quality. A ball on the screen won’t look as real as one in real life if the display doesn’t pump out the same quality of light, or distorts the color.
Citing this research, Sony makes an appealing argument for why 8K could be good, but there remains the content problem. There’s almost no native 8K content available, and none planned as of yet. Even the reference monitors Sony is striving to emulate will still be 4K while the new sets are 8K, which means the sets will rely heavily on upscaling to render 4K content across four times the number of pixels.
Sony, Samsung, and the other TV makers launching 8K sets this year think their upscalers are up to the task and can render a 4K picture on an 8K set as beautifully as if it were the native resolution. Having only really seen these sets with 8K content, or cherry-picked 4K content, I can’t safely say whether those claims are accurate. But in that closed-door briefing, Sony did its job. Its theory is sound enough that I don’t want to write 8K off as a fad like 3D. There could be something there. But we won’t know until all these 8K sets start shipping later this year.