The technology powering the display on your phone, or even your TV, is a lot different than it was even ten years ago. More colors, more pixels, and a whole lot more acronyms and complex terms that mean something—even if you have no idea what that something is. Display technology in 2017 is a complicated business, but if you understand some basic concepts and a few of the acronyms everything starts to be about as clear as that sweet iPhone display you might be reading this on.
There are plenty of details and differences, but at the most fundamental level you’ve got two competing ways of building a display. Once you understand the basics of OLED versus LCD things begin to make a lot more sense.
Whatever type of gadget-with-a-screen you come across, from phones to TVs, it’s either going to have a display based on OLED (Organic Light-Emitting Diode) technology or on LCD (Liquid Crystal Display) technology.
LCD uses an extra backlight layer to illuminate the pixels. The light source is usually LEDs. On cheaper displays, often found in TVs and laptops, the LEDs sit only on the edges of the display, leading to brighter edges and a slightly dimmer center. Most LCD-based phones and pricier TVs are “full-array” backlit: the LEDs sit across the entire back of the display, cutting down on dim spots and often giving you better contrast and control of light levels.
OLED takes things one step further. The pixels in an OLED screen illuminate themselves when an electric current passes through them. As a result, the brightness of an OLED screen can be controlled on a pixel-by-pixel basis. That makes it great for mobile devices, where fewer lit pixels mean less energy consumed, and it also makes it good for high-end TVs, where finer control of light means a more realistic rendition of scenes.
At this point you’re probably just wanting to know which is better, but there’s no easy answer—manufacturers stick their own proprietary technologies on top of these displays, and so an OLED display doesn’t necessarily beat an LCD display or vice versa. However, we can talk a bit about the advantages and disadvantages of each display type.
Typically, OLED screens are known for faster response times and better contrast—black OLED pixels are completely switched off, whereas black LCD pixels are technically just dimmed as far as possible. Displays using OLED technology have wide viewing angles and a wider color gamut, and they also open the door to curved and flexible displays (which is why Apple will have to make the switch if it ever wants a curved iPhone).
You might notice you don’t see OLED used in many monitors yet—that’s because manufacturers are struggling to get the yield rates (the number of working panels versus the number of failed ones) high enough to make them cost effective. That doesn’t matter so much in phones, where the displays are smaller and cheaper, or television sets, where high prices mean the costs can be recouped.
There’s also the issue of static images getting ‘burned’ into the display, a traditional drawback of OLED technology. Smartphone screens usually turn off after a few seconds, and TV pictures are constantly on the move, but burn-in remains a problem for monitors—one that manufacturers are still trying to overcome.
On the other hand, LCD screens—broadly speaking—draw less power when showing bright, colorful content (the backlight’s draw stays roughly constant, while OLED pixels draw more the brighter they shine), are often sharper, are easier to see outdoors, and (some might say) offer more natural color reproduction. From the manufacturing point of view, LCD displays are cheaper to produce as well.
That makes a big difference when you get to TV sizes, which is why OLED TVs are more expensive right now. The truth is that both OLED and LCD have been tweaked and improved significantly in recent years, fixing a lot of their respective weaknesses, so the traditional differences between the two are no longer as significant as they once were.
Recent Panasonic TVs, for example, use a special ‘honeycomb’ technology to improve the contrast performance of their LCD screens and better compete with OLED. Likewise, the Quantum Dot technology in Samsung’s high-end Q9 TVs is designed to improve color reproduction and better compete with OLED.
Innovations like these are constantly arriving on the market, so in 2017, reading a few reviews of a smartphone or television is likely to give you a better idea of the quality of the display than whether it’s OLED or LCD.
One aspect of your TV, phone, or laptop that isn’t quite as clear cut is resolution. Resolution is the most common display spec you’ll see, besides the type of the display. Ostensibly it relates to how many pixels can be displayed—more pixels means it’s easier to see finer details. On a phone the common resolutions are 1920 x 1080 (Full HD) or 2560 x 1440 (QHD); on TVs, 4K (3840 x 2160) is becoming the standard. 5K (5120 x 2880) is also seeing some popularity—notably in Apple’s newest iMacs.
Yet, frustratingly, whether resolution makes much difference at a given screen size is a point of debate—sit more than six feet from a typical TV, and there is little discernible difference between 4K and 1080p. Instead, what matters is pixel density. That should be the same thing as display resolution, right?
Apple introduced the term “Retina” back in 2010 to describe a screen dense enough that it’s impossible to pick out individual pixels with the naked eye at a normal viewing distance. For example: the iPhone 7 Plus has a display resolution of 1920 x 1080. Stretched across a 55-inch TV, that would equal a pixel density of just 40 pixels per inch. Yet the iPhone 7 Plus packs those pixels in ten times more densely (401 ppi).
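Those numbers fall straight out of a little arithmetic: pixel density is just the diagonal pixel count divided by the diagonal size in inches, and a rule-of-thumb eye acuity of about one arcminute (an assumption, not something Apple publishes) gives the distance beyond which individual pixels blur together. A quick sketch:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_distance_ft(width_px, height_px, diagonal_in):
    """Distance (feet) past which a ~1-arcminute eye can't resolve one pixel."""
    pixel_pitch_in = 1 / ppi(width_px, height_px, diagonal_in)
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

# The same 1920 x 1080 grid on a 55-inch TV vs a 5.5-inch iPhone 7 Plus:
print(round(ppi(1920, 1080, 55)))    # ~40 ppi
print(round(ppi(1920, 1080, 5.5)))   # ~401 ppi

# Past ~7 feet, a 55-inch 1080p panel already looks "Retina";
# 4K only pays off if you sit closer than about 3.6 feet:
print(round(retina_distance_ft(1920, 1080, 55), 1))
print(round(retina_distance_ft(3840, 2160, 55), 1))
```

The second function also explains the six-feet-from-the-TV debate above: at typical living-room distances, the eye simply runs out of resolving power before the panel runs out of pixels.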
Prioritizing pixel density over raw display resolution has become the norm in mobile devices. So when trying to decide on a device, note the resolution and then note the pixels per inch. The higher the PPI, the better the display will look up close.
If you’re shopping for a TV, or a monitor you might game on, refresh rates come into play. They’re literally the speed at which the screen refreshes: 120Hz means 120 refreshes per second. With a higher refresh rate, sports and video games look smoother, while your movies pick up some of that “soap opera” effect that may or may not be to your liking.
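The smoothness comes from how long each frame is held on screen, which is just the reciprocal of the refresh rate—a minimal sketch:

```python
def frame_time_ms(refresh_hz):
    """Milliseconds each frame is held on screen at a given refresh rate."""
    return 1000 / refresh_hz

print(round(frame_time_ms(60), 1))   # 16.7 ms between refreshes
print(round(frame_time_ms(120), 1))  # 8.3 ms -- half the wait, smoother motion
```

Doubling the refresh rate halves the time each frame lingers, which is why fast motion smears less at 120Hz than at 60Hz.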
But be on the lookout for a TV or monitor that claims an “effective” refresh rate. The native refresh rate is what the panel is actually capable of. The “effective” refresh rate is usually much higher and involves some kind of video trickery to mimic what a higher refresh rate would present. That sounds great until the trickery fails—then you’ll see a lot of wonkiness in what you’re watching, from footballs disappearing to action heroes’ heads flickering in and out of existence.
“Effective” refresh rate wizardry is currently less of an issue in the smaller displays of smartphones, with flagship handsets usually sticking to 60Hz. Interestingly enough, the new iPad Pro features a dynamically changing refresh rate of up to 120Hz, allowing for better responsiveness when you’re tapping the Apple Pencil or your finger on the screen.
One of the hottest display specs of recent years is HDR, or High Dynamic Range, which we’ve written about before. HDR-enabled displays offer a better balance between the darkest and lightest parts of a picture, so you can still see detail in a very gloomy or very bright area of the screen. HDR also widens the color gamut, the range of colors the display can show.
Because life is never simple, there are competing HDR formats to consider: the popular and free-to-license HDR10, the smarter but pricey-to-license Dolby Vision, and the newer Hybrid Log-Gamma (HLG), which offers backward compatibility with older sets. You may see one or more of these standards supported by your next TV.
Industry body the UHD Alliance has included HDR in its Ultra HD Premium badge of approval for TVs and other devices, laying down minimum requirements for color, brightness, refresh rate, audio, and the other specs that go into making up a 4K-or-better display. Both LED LCD and OLED televisions have done enough to earn the badge, which goes back to our earlier point about these technologies being close together in terms of end results.
As display tech develops, you can expect to see manufacturers build even more improvements on top of the foundations of LCD and OLED, and add even more confusing terms and acronyms into the mix. On one level, the underlying technology doesn’t matter as long as you’ve got a good screen in front of you.
What matters more are the specs like resolution and HDR support that we’ve already mentioned, so you should now have a better understanding of what to look for the next time you’re in the market for a device (or a better understanding of the device you’ve already got)... at least until the next CES, anyway.