Efficiency of Audio Codecs

Elsewhere it was pointed out that in my earlier post I was counting the standard Dolby Digital core in the bitrate of Dolby TrueHD, and that this core is not strictly necessary for TrueHD itself. The reason I do that is that I’m actually interested in two things: the relative efficiencies of DTS-HD MA and Dolby TrueHD as codecs, and, separately, as real-world implementations.

On the former measure, Dolby TrueHD is clearly more efficient. It uses a predictive algorithm for the audio, plus sample-by-sample corrections to restore full precision. DTS-HD MA is somewhat similar, except that instead of a predictive algorithm, it uses the lossy DTS core as the thing to be corrected.
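To make the “prediction plus corrections” idea concrete, here is a toy sketch in Python. It is not the actual TrueHD (MLP) or DTS-HD MA mathematics; the predictor is just the previous sample standing in for a real model (or for the decoded DTS core). The point is that the better the prediction, the smaller the residuals and the better they compress, while reconstruction stays bit-exact either way.

```python
# Toy illustration of lossless predictive coding. Not the real TrueHD or
# DTS-HD MA algorithms: the "prediction" here is simply the previous sample.

def encode(samples):
    """Store each sample as the difference from a prediction (the residual)."""
    residuals = []
    prediction = 0
    for s in samples:
        residuals.append(s - prediction)  # small residuals compress well
        prediction = s                    # next prediction = current sample
    return residuals

def decode(residuals):
    """Rebuild the original samples exactly from the residuals."""
    samples = []
    prediction = 0
    for r in residuals:
        s = prediction + r
        samples.append(s)
        prediction = s
    return samples

if __name__ == "__main__":
    pcm = [0, 3, 7, 8, 8, 6, 2, -1]
    assert decode(encode(pcm)) == pcm  # bit-exact reconstruction
```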

That was a sensible choice on DTS’s part, given the need to carry the standard DTS backup anyway. Had that requirement not existed, they would no doubt have come up with a more efficient algorithm.

Because standard DTS runs at a fixed bit rate, it will always be somewhat inefficient: in most movies there are long stretches where the sound does not need anywhere near the full allotment of bits to be encoded completely.
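A rough back-of-the-envelope shows the scale of that waste. The figures below are assumptions for illustration only (a 1,509 kbps core and a guess at what quiet passages actually need), not measurements from any particular disc.

```python
# Illustrative numbers only: a constant-bit-rate core spends the same bits
# on quiet dialogue as on a battle scene, whereas a variable-rate lossless
# stream can drop its rate when the content is easy to encode.
core_kbps = 1509       # assumed constant DTS core rate
quiet_minutes = 40     # assumed amount of quiet dialogue/ambience in the film
needed_kbps = 600      # hypothetical rate actually needed for those passages

wasted_bits = (core_kbps - needed_kbps) * 1000 * quiet_minutes * 60
print(f"Roughly {wasted_bits / 8 / 1e6:.0f} MB spent carrying padding, not sound")
```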

I am also interested in the total real-world bitrate, which in the case of TrueHD includes the Dolby Digital element. In a fully equipped home theatre system, the DD part is a total waste of space since it is not used at all, but is provided merely to support legacy equipment. Still, it is there and needs to be counted as part of the package.
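For the “whole package” view, a similar back-of-the-envelope works, again with assumed figures: TrueHD itself is variable-rate, but the companion Dolby Digital track rides alongside it at a constant rate for the full running time.

```python
# Back-of-the-envelope for the total real-world package, assumed figures only.
truehd_avg_kbps = 3000   # assumed average rate for the TrueHD track
dd_core_kbps = 640       # assumed constant rate for the companion DD track
runtime_min = 120

total_kbps = truehd_avg_kbps + dd_core_kbps
total_gb = total_kbps * 1000 * runtime_min * 60 / 8 / 1e9
dd_share = dd_core_kbps / total_kbps
print(f"Package: ~{total_gb:.1f} GB, of which {dd_share:.0%} is the legacy DD track")
```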
