Contrast Ratio Shoot-Out (Everyone Loses)

Pioneer lined up its newest plasma display next to top-of-the-line TVs from Panasonic, Samsung, Sony and Sharp today. An interesting experiment, for sure. And it got everyone in the room talking about the same thing: contrast ratios. You can see why just by looking at the image above, which shows three plasma screens with power on but no image (the Pioneer is on the lower right, the Samsung HP-T5064 is on top and the Panasonic PZ700U is on the left). None of the screens were calibrated, which would make a difference. But the reason Pioneer's screen looks so much darker has to do with a lot more than calibration, or contrast ratio.

Consumer electronics companies love spec wars. Whether it's processor speeds, throughput or megapixels, gadget makers like to throw around big numbers that separate money from wallets. And contrast ratios are the spec war du jour. But despite claims ranging from 5,000:1 (Panasonic) all the way to 1,000,000:1 (Sony's upcoming OLED), there is no agreed-upon industry standard for measuring contrast ratios. As a result, there are a number of tricky ways to influence the outcome of a contrast ratio test, and none of them have anything to do with the real-world contrast you will experience while sitting at home...


What is contrast ratio? Simply put, it's the difference between the darkest and brightest spots a display can produce, expressed as a ratio of luminance. A good way to think of contrast ratio is as the volume dial on a stereo: yours might go all the way to 11, but that doesn't mean it sounds good. The environment has an effect too. The advertised contrast ratio has no bearing on how well the screen will perform in your house while reflecting light from a bank of windows on the other side of the room. Props to Sony for admitting as much, even if it is only in the fine print:

VESA test and measurement methods are applied yielding a contrast ratio of 7000:1. This number represents the widest possible ratio between black and white contrast levels. Sony also measures their BRAVIA televisions with a more stringent method that measures the amount of black and white levels that can appear on the screen at the same time. This method yields a more real world measurement of 1300:1.


LCD shoot-out: the Sony XBR is on top, Sharp Aquos lower left and the Pioneer plasma is on the right.

What contrast ratio is not: Contrast has little to do with color range or accuracy; color accuracy lives in a screen's gray-scale capabilities. But with better and deeper blacks, some TVs are also better able to show the gradations of color as they fall away.

How is contrast ratio measured? In a very dark room, first of all. The big number on the front end (i.e. the 20,000 in 20,000:1) is the light side, and the 1 is the dark side. The dark side, therefore, has a much greater effect on the ratio than the light. Cutting the darkest dark a screen can produce in half effectively doubles the contrast ratio. Which is why you hear a lot about "true black" and never about screens as bright as arc welding torches that you need shaded lenses to view.
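To make that arithmetic concrete, here is a minimal sketch in Python. The luminance figures are purely hypothetical, not readings from any of the sets mentioned above:

    # Contrast ratio is just peak white luminance divided by black luminance.
    # These numbers are invented for illustration.
    white_luminance = 500.0   # brightest white, in cd/m^2 (hypothetical)
    black_luminance = 0.05    # darkest black, in cd/m^2 (hypothetical)

    print(f"Contrast ratio: {white_luminance / black_luminance:,.0f}:1")   # 10,000:1

    # Halve the black level and the ratio doubles, even though the bright
    # side of the picture hasn't changed at all.
    print(f"Half the black level: {white_luminance / (black_luminance / 2):,.0f}:1")   # 20,000:1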


How is contrast calculated? Contrast ratio is calculated in many different ways. The basic idea is to make one part of the screen dark and one part really bright, then measure the difference with a light meter. Among the best known is the "full on/full off" method. There is also ANSI, which was developed as a way of measuring the contrast of projectors. Another commonly used method is the VESA (Video Electronics Standards Association) Flat Panel Display Measurement (FPDM) standard. And in Japan, JEITA (the Japan Electronics and Information Technology Industries Association) rules the roost.
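To see why the choice of method matters so much, here is a hedged sketch of how a full on/full off test and an ANSI-style simultaneous checkerboard test can rate the very same panel. Every reading below is invented for illustration:

    # Full on/full off: measure an all-white frame, then an all-black frame.
    # With nothing bright on screen during the black reading, the panel can
    # crush its blacks, so the ratio comes out huge.
    full_on = 450.0    # cd/m^2, all-white frame (hypothetical)
    full_off = 0.03    # cd/m^2, all-black frame (hypothetical)
    print(f"Full on/full off: {full_on / full_off:,.0f}:1")   # 15,000:1

    # ANSI-style: white and black patches share the screen in a checkerboard,
    # and the readings from each set of patches are averaged. Light spilling
    # from the white patches lifts the measured black, so the ratio shrinks.
    white_patches = [430.0, 440.0, 425.0, 435.0]   # cd/m^2 (hypothetical)
    black_patches = [0.35, 0.40, 0.38, 0.33]       # cd/m^2 (hypothetical)
    ansi_ratio = (sum(white_patches) / len(white_patches)) / (sum(black_patches) / len(black_patches))
    print(f"ANSI checkerboard: {ansi_ratio:,.0f}:1")          # roughly 1,200:1

The second, smaller number is much closer to what Sony's fine print above calls a "more real world measurement."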


I was going to detail each one of these methods, but what's the point: they are all flawed and equally meaningless because no one runs the tests in exactly the same way. There is no ideal setting, for one thing, no room that is perfectly black. And just what black is being measured? You could turn the TV off entirely, which results in a screen that is darker than when the display is powered up. Some companies measure contrast without the filter on the front of the screen, which yields much brighter whites. Another option is to measure the black in a dark room, but then move the display to a well-lit room and turn it on to measure the light side. All of these examples come from interviews with people who work in the display industry and who asked to remain anonymous.

Isn't someone developing a standard method for measurement? The Consumer Electronics Association is discussing the possibility, but nothing is moving forward. And it probably won't until there is enough hue and cry for a change to occur.


If there were a standard, what would it be? Instead of one big number, the better way to talk about contrast ratio would be in terms of real-world settings. TVs should specify contrast for scenarios like watching a broadcast football game during the day versus a high-def movie at night with all the lights off. More confusing, but also more accurate.

So what should I make of all this contrast ratio bragging going on? The best bet is to ignore it entirely. It is marketing hype. Contrast ratio is the least important number to look at when making a purchase. Judge with your own eyes—it's worth noting that your eyes recognize contrast far more clearly than resolution, which is why a 1080p screen with low contrast may not look as good as a 720p screen with high contrast. Enthusiast magazines such as Perfect Vision and Home Theater do their own contrast ratio tests on screens they review, which is a good option. But ultimately, those publications agree with me.


This topic always generates some hot commenting action. Fire away and I'll answer questions as best I can. But I've got a question for the readers as well: Is contrast ratio a useful, albeit flawed, way of comparing TVs and announcing new TV technologies, or should Gizmodo ignore that number from now on?