Do audio bit rates matter? With iTunes enhanced-bitrate music coming in a month, we were hoping for a vast improvement. But can anyone tell the difference between a music track encoded at 320kbit/sec, 160kbit/sec and, oh lord, that holy grail of audiophile nirvana, the uncompressed WAV? Our friends at Maximum PC decided to put audio compression to the test, enlisting four people to listen to music first in uncompressed form, and then encoded with a variable bit rate at 320kbit/sec and at the lowly 160kbit/sec. It was easy for everyone to tell the difference, right? Right?

It's downright humiliating, in fact, that in many cases, we were unable to tell the difference between an uncompressed track and one encoded at 160Kb/s, the bit rate most of us considered the absolute minimum acceptable for even portable players.

Most of the time, even a golden-eared audiophile couldn't tell the difference between uncompressed and highly compressed audio. These results roughly match the Slate Explainer we referenced a couple of weeks ago. One caveat: the compressed files were hard to detect in part because they were encoded using a variable bit rate, which makes a huge difference in complex musical passages that would otherwise suffer under compression. What a revealing test, and a great read!
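If your ears can't tell the formats apart, your hard drive certainly can. A rough back-of-the-envelope sketch of what those bit rates mean in file size, assuming a hypothetical 4-minute track at CD quality (44.1kHz, 16-bit, stereo, which works out to 1,411.2kbit/sec uncompressed):

```python
# Rough file sizes for a hypothetical 4-minute track at the bit rates
# from the test. Uncompressed CD audio streams at
# 44,100 samples/sec * 16 bits * 2 channels = 1,411.2 kbit/sec.
SECONDS = 4 * 60  # assumed 4-minute track

def size_mb(kbit_per_sec: float) -> float:
    """Convert a bit rate into total megabytes for the track."""
    return kbit_per_sec * 1000 * SECONDS / 8 / 1_000_000

for label, rate in [("uncompressed WAV", 1411.2),
                    ("320kbit/sec MP3 ", 320),
                    ("160kbit/sec MP3 ", 160)]:
    print(f"{label}: {size_mb(rate):5.1f} MB")
# uncompressed WAV:  42.3 MB
# 320kbit/sec MP3 :   9.6 MB
# 160kbit/sec MP3 :   4.8 MB
```

In other words, 160kbit/sec packs roughly nine tracks into the space of one WAV — which is exactly why nobody ships portable players full of uncompressed audio. (Note that a VBR encode targeting those rates will land near, not exactly at, these sizes.)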

Do Higher MP3 Bit Rates Pay Off? [Maximum PC]