A few weeks back, one thousand of our readers participated in our MP3 bitrate test. Today, with a little help from a stats expert, we have the results, plus a recommended rip rate that most of you can live by.
Readers who took the test listened to three songs at varying bitrates on their own sound systems and identified the threshold at which encoding quality stopped mattering to their ears. After statistically evaluating the results, we found not only a bitrate that most of us can live by, but also that there is joy to be gleaned from uncompressed audio, especially if you spent money on your sound system.
Our Finding
If you're encoding MP3s in iTunes, do so at 256kbps. Why? The mean peak bitrate that users reported being able to distinguish across all three songs tested was 218.68kbps (once we removed WAVs, the clear outliers, from the results). Aim a bit higher than 218kbps and you should be set. (Notably, users reported different bitrate thresholds for different songs. I'm betting the quality of a source recording, even among CDs, can make a real difference even after a song is compressed.)
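(For the curious, there's nothing exotic behind that figure. Below is a minimal sketch of the calculation, not our actual analysis script; the file name responses.csv, the column name max_bitrate_kbps, and the convention of storing WAV answers as the text "WAV" are all made up for illustration.)

```python
import csv
from statistics import mean

WAV_ANSWER = "WAV"  # hypothetical marker for "only the uncompressed file stood out"

def mean_peak_bitrate(path):
    """Average the highest bitrate each respondent reported being able to
    distinguish, dropping WAV answers as outliers (as in the 218.68kbps
    figure above). The file layout and column name are assumptions for
    this sketch, not our real survey export."""
    peaks = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            answer = row["max_bitrate_kbps"].strip()
            if answer == WAV_ANSWER:
                continue  # exclude the WAV outliers from this average
            peaks.append(float(answer))
    return mean(peaks)

if __name__ == "__main__":
    print(f"Mean peak bitrate: {mean_peak_bitrate('responses.csv'):.2f} kbps")
```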
Of course, our data is more interesting than a single piece of advice about MP3 bitrate encoding. Other findings:
19.65% of all participants responded that WAVs sounded better than MP3s in at least one of the three songs they tested. While the superiority of WAVs could be an imagined difference (our testing wasn't blind), is it so hard to believe that uncompressed audio is noticeably better? With the ever-expanding waistlines of even laptop hard drives, maybe uncompressed audio (or even lossless compression like FLAC) is worth considering.
Still, our most interesting finding was a statistically significant correlation between the amount a listener spent on their audio equipment and the maximum bitrate they could detect. In other words, the more expensive a participant's stereo, the higher the bitrate at which they could still hear a difference.
Why such a noticeable correlation? There could be a variety of explanations. Distinctions in bitrate may be easier to discern on more acoustically responsive audio equipment (which is generally more expensive). Purchasers of higher-end audio equipment may simply have better ears. Or, of course, those who spend the most on their speakers might just be deluding themselves in their own snobbery.
Really, the correlation could be a combination of all three of those factors...or none of them.
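Whatever the explanation, the correlation itself is easy to check once you have the survey data in hand. Here's a minimal sketch of one way to do it: a rank correlation is a reasonable choice given how skewed equipment spending tends to be, though we're not claiming it's the exact test our stats expert ran, and the file and column names are again hypothetical.

```python
import csv
from scipy.stats import spearmanr  # rank correlation; robust to skewed spending figures

def spend_vs_bitrate(path):
    """Correlate reported equipment spend with the highest bitrate each
    respondent could still tell apart. File and column names are
    assumptions for this sketch."""
    spend, bitrate = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["max_bitrate_kbps"].strip() == "WAV":
                continue  # skip WAV answers, as in the bitrate average above
            spend.append(float(row["equipment_spend_usd"]))
            bitrate.append(float(row["max_bitrate_kbps"]))
    return spearmanr(spend, bitrate)  # returns (rho, p-value)

if __name__ == "__main__":
    rho, p = spend_vs_bitrate("responses.csv")
    print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # p < 0.05 reads as statistically significant
```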
It was a fun test and we're pleased that so many of our readers took time out of their other responsibilities to participate. If nothing else, we got to make some mean graphs. Get it?
* Reported results are based on 743 complete surveys out of the more than 1,000 submitted. Results reported with WAV outliers removed are based on 597 complete surveys.
FAQs
Why didn't you guys test FLAC or something?
Face it: the average person opens iTunes to import their CDs in MP3 format. They aren't downloading special third-party software, so this test was for them. Mankind can perform additional tests in the future, you know. And besides, if a format is truly lossless, the WAV test covers that category.
Why didn't you blind test?
Quite simply, reliable blind testing wasn't feasible. Even if we didn't disclose the samples' bitrates, users could easily find the bitrate in the metadata or by comparing file sizes. It's a limitation we acknowledge, and we've kept it in mind to avoid drawing unwarranted conclusions. Furthermore, many sound experts feel that blind testing is itself flawed. We won't go into it here, but there are arguments on both sides.
A special thanks to Definitive for supplying us with two of their wonderful Mythos STS Supertowers ($3,000/pair) and Pioneer for lending us a recently released VSX-1019AH-K ($500), a solid receiver with notable iPhone/iPod integration.