How good is HDTV anyway?

[Image to the right: Three views of ‘Lost’]

Of course, I love high definition. As I sit here writing, my son is playing ‘Call of Duty 2’ on an Xbox 360, set to 1080i output, running via a Denon AVR-4306 receiver (with Dolby Digital 5.1 audio) to an InFocus IN76 projector (720p DLP). Marvellous!

But there’s more to picture quality than just resolution, although resolution is a big part of it. Let us look closely at three screen shots, courtesy of anthonysimilion at the DBA Forum (I have used Capture 3 from his captures for no good reason other than it was the first set I viewed).

Now please excuse the size of the graphic to the right, but I felt it necessary to take a large section of each of the three shots to compare. The top one is from a Channel 7 SDTV broadcast of an episode of ‘Lost’, the second from the so-called ‘high definition’ version also broadcast on Channel 7 (so-called, because it’s only 576p). The final one is a 720p high definition screen shot he managed to dig up of the same episode as broadcast on ABC TV in the United States.

The last shot is in the original resolution, while the first two have been scaled up slightly to match the size of the original. Now, which one do you think is best? (You may need to copy the graphic to your computer and slice it up into three pictures for side-by-side comparison.)

In the natural scheme of things, you might think that 720p would be better than 576p which, in turn, would be better than 576i. After all, at least notionally, 576p has twice as much information as 576i, and 720p has more than twice as much information (2.2 recurring, to be precise) as 576p. But there are complications. The first is the source. Was this high definition in the first place? If it originated as 576i, then transmission at 576p or 720p doesn’t offer much, except possibly slightly better picture quality due to a professional grade scaler/de-interlacer at the television station end.
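If you want to check that arithmetic for yourself, it is easy enough to tot up. Here is a rough Python calculation of my own (nothing fancy, assuming the 50Hz field and frame rates we use here, and counting active pixels only, ignoring blanking):

    # Rough pixel-per-second arithmetic for the three formats discussed above.
    formats = {
        "576i50": (720, 576, 25),   # interlaced: 50 fields = 25 full frames/sec
        "576p50": (720, 576, 50),
        "720p50": (1280, 720, 50),
    }

    rates = {}
    for name, (width, height, frames_per_second) in formats.items():
        rates[name] = width * height * frames_per_second
        print(f"{name}: {rates[name]:,} pixels per second")

    print(f"576p vs 576i: {rates['576p50'] / rates['576i50']:.2f}x")   # 2.00x
    print(f"720p vs 576p: {rates['720p50'] / rates['576p50']:.2f}x")   # 2.22x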

Only possibly, though, because there is no guarantee that the pro unit is better than a consumer level version in the home. The reason is that while pro equipment is normally better when it first comes out, TV stations don’t upgrade their equipment unless there is a commercial imperative to do so. A professional scaler/de-interlacer produced circa 2000 is almost certainly inferior to a good quality unit built into some 2006 display devices.

Now let’s assume that the picture was, in fact, high definition in the first place, say 720p. Then this still doesn’t settle the matter, because the other major control over picture quality is the level of compression used. Digital TV uses some form of MPEG2 video compression. This works by tossing out information that the built-in algorithms determine to have the least effect on visual quality. In this sense, it is very similar to JPEG still image compression. But there are two differences. First, any particular frame in an MPEG2 stream tends to be more heavily compressed than a typical JPEG. You can get away with this in MPEG because the frames are flipping over 25 times per second, so any visual imperfections in one frame, unless they are too high in contrast, tend to get ‘ironed out’ by the quick sequential flow.
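If you would like a feel for what heavier per-frame compression looks like, the JPEG analogy is handy. Here is a little sketch of my own, assuming Python with Pillow installed and a frame grab saved as ‘frame.png’ (a hypothetical file name); it simply re-saves the same image at progressively lower JPEG quality so you can watch the artefacts creep in:

    # Illustrative only: JPEG stands in for the per-frame quantisation that
    # MPEG2 applies; 'frame.png' is a hypothetical frame grab.
    from PIL import Image

    frame = Image.open("frame.png").convert("RGB")

    for quality in (90, 50, 20, 10):
        out_name = f"frame_q{quality}.jpg"
        frame.save(out_name, "JPEG", quality=quality)
        print(f"Saved {out_name} at JPEG quality {quality}")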

The second difference is that with MPEG2, the elimination of ‘unnecessary’ visual material takes place in the time domain in addition to the spatial domain. So, in fact, only every 12th or 15th frame in a movie is held fully (albeit in a heavily compressed format). All the intermediate frames are constructed from the previous or following fully-held frame, plus ‘difference’ information held for that frame. Poorly implemented MPEG2 compression can often be spotted by a difference in subtle movement between parts of the image. For example, this might be a barely perceptible shifting of the facial features during a closeup, while the outline of the face remains steady. This can lead to a profoundly disquieting viewing experience.
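As a very stripped-down illustration of the ‘full frame plus differences’ idea (ignoring the motion compensation that real MPEG2 encoders use, and assuming NumPy), you might sketch it like this:

    # Toy illustration of 'full frame plus differences' storage. Real MPEG2
    # predicts blocks with motion compensation; this just keeps one reference
    # frame and a coarsely quantised difference for the frame that follows.
    import numpy as np

    height, width = 576, 720
    rng = np.random.default_rng(0)

    reference = rng.integers(0, 240, (height, width)).astype(np.int16)
    next_frame = reference.copy()
    next_frame[100:200, 100:200] += 10            # a small region changes

    step = 4                                      # coarse quantisation step
    difference = ((next_frame - reference) // step) * step

    # The decoder rebuilds the frame from reference + difference information.
    reconstructed = np.clip(reference + difference, 0, 255)
    print("Worst reconstruction error:", np.abs(reconstructed - next_frame).max())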

But the more common effects of MPEG2 compression are:

  • random low level noise near high contrast boundaries in the image;
  • posterisation; and
  • reduced sharpness.

The higher the level of compression, the more likely some of these are to be evident.

Repeated compression/decompression/compression makes things especially bad. Many sporting events are an example of this. It is rare that you will notice the noise issue with studio material or drama, but watch the cricket or (more topically) the World Cup on a decent, large, high resolution screen and you will see a halo of noise following the players around the field. I speculate that the action is MPEG2 compressed at some point before it gets to your TV station, presumably for efficient transmission, and then is either transcoded or decompressed and recompressed during broadcast to you, emphasising all the MPEG2 artefacts.
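Leaning on the JPEG analogy again rather than real broadcast gear, you can simulate that compress/decompress/recompress chain by round-tripping the same image several times. This is my own rough sketch, again assuming Pillow and the hypothetical ‘frame.png’; each pass does a little more damage, which is the generational loss I suspect the sport chain suffers:

    # Illustrative generational loss: repeatedly decode and re-encode the same
    # image with lossy compression. JPEG stands in for the MPEG2 stages in the
    # broadcast chain; 'frame.png' is a hypothetical frame grab.
    from PIL import Image

    image = Image.open("frame.png").convert("RGB")

    for generation in range(1, 6):
        name = f"generation_{generation}.jpg"
        image.save(name, "JPEG", quality=60)      # lossy encode
        image = Image.open(name).convert("RGB")   # decode, ready to recompress
        print(f"Wrote {name}")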

So what has all this to do with our pictures to the right?

First, I think we can all agree that in this case the 720p image is ever so slightly sharper than the 576p one and this, in turn, is just a touch more detailed than the 576i one. But to my eye, the 576p version is the best of the three. The reason is excessive posterisation of the 720p image. Posterisation is the name given to a reduction in smoothness of the colour or grey-scale gradations, so that the picture looks less like a photo and more like a paint-by-the-numbers picture. Look around the left cheekbone and you will see how detail is lost not because of low resolution, but because subtle differences in colour have been merged into a mono-colour splodge.
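Posterisation is also easy to fake, if you want to see the effect in isolation: throw away most of the colour levels and smooth gradations collapse into flat patches. Another quick sketch of my own, assuming Pillow and the same hypothetical ‘frame.png’:

    # Fake posterisation by throwing away the low bits of each colour channel,
    # collapsing smooth gradations into flat 'paint-by-numbers' patches.
    from PIL import Image

    levels_to_keep = 8                    # 8 levels per channel instead of 256
    step = 256 // levels_to_keep

    image = Image.open("frame.png").convert("RGB")
    posterised = image.point(lambda value: (value // step) * step)
    posterised.save("posterised.png")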

So, in the end, it often all comes down to the bitrates used by the TV station. Remember, the pixel clock rate for 576i is 13.5MHz and for 576p is 27MHz, but for 720p/50 it is a massive 74.25MHz! You need a lot more bits allocated to attempt to represent that realistically.
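To put those clock rates side by side: 720p/50 carries 2.75 times the samples per second of 576p and 5.5 times that of 576i, so at a fixed broadcast bitrate the encoder has to squeeze each sample correspondingly harder. A back-of-the-envelope check of my own (the 15Mbps channel figure is just an assumption for illustration):

    # Back-of-the-envelope comparison of the pixel clock rates quoted above.
    pixel_clocks_mhz = {"576i": 13.5, "576p": 27.0, "720p/50": 74.25}

    for name, clock in pixel_clocks_mhz.items():
        print(f"{name}: {clock} MHz ({clock / pixel_clocks_mhz['576i']:.2f}x 576i)")

    # At a fixed channel bitrate, the bits available per sample shrink in step.
    channel_mbps = 15.0                   # hypothetical broadcast bitrate
    for name, clock in pixel_clocks_mhz.items():
        bits_per_sample = channel_mbps / clock
        print(f"{name}: roughly {bits_per_sample:.2f} bits per pixel-clock sample")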
