I’ve been struggling to get my head around this for some time. The problem has been a lack of replicable test signals. But I think I’ve got it nailed, at last.
I was watching a late night movie on ABC TV recently. Normally these are old black and white things, but this one happened to be in colour. It wasn’t particularly good. This was on digital TV, and I was using a Strong SRT-5490 HD PVR. This was feeding via its DVI output to the HDMI input on a BenQ 1080p DLP projector. But not directly. The signal was going via a DVDO iScan VP50, one of the best video processors in the world.
The output of the Strong was set to 1080i, and the VP50 was converting this to 1080p.
As I was watching, I noticed that there was a moiré pattern on the fine texture of the necktie of one of the characters. So I recorded a snippet of the video.
Many of these late night movies are of appalling quality, primarily because the video appears to have been derived from an NTSC source, and converted to PAL.
You can convert NTSC to PAL properly by doing some heavy processing: reversing the 3:2 pulldown so that you end up with the original film frames, and then recompiling them into 25 frames per second PAL. But these movies are rarely treated that way. Instead, they are converted to PAL directly from NTSC, with most of the frames showing heavy interlacing. Video processors deal with that by eliminating the second field (at least for the moving parts of the picture, in the case of high quality processors such as the VP50), further softening what is already pretty fuzzy video.
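To make the 'heavy processing' route concrete, here is a minimal sketch of the 3:2 pulldown cadence and its reversal. The frame labels and the simple duplicate-dropping logic are my own illustration, not the algorithm any particular processor uses; real inverse telecine has to detect the cadence in noisy video, which is much harder.

```python
# Hedged sketch: simulate 3:2 pulldown of 24 fps film into NTSC fields,
# then reverse it to recover the original film frames.

def pulldown_32(frames):
    """Expand film frames into NTSC fields using the 3:2 cadence
    (3 fields from the first frame, 2 from the next, and so on)."""
    fields = []
    for i, frame in enumerate(frames):
        count = 3 if i % 2 == 0 else 2
        for _ in range(count):
            parity = 'top' if len(fields) % 2 == 0 else 'bottom'
            fields.append((frame, parity))
    return fields

def reverse_pulldown(fields):
    """Recover the film frames by collapsing runs of repeated fields.
    Trivial here because our labels are exact; real video isn't."""
    frames, last = [], None
    for frame, _parity in fields:
        if frame != last:
            frames.append(frame)
            last = frame
    return frames

film = ['A', 'B', 'C', 'D']
ntsc_fields = pulldown_32(film)
print(len(ntsc_fields))                        # 10 fields from 4 frames
print(reverse_pulldown(ntsc_fields) == film)   # True
```

Four film frames become ten fields (hence 24 fps becoming 60 fields per second); dropping the duplicates gets the four frames back, which can then be sped up slightly to 25 fps for PAL.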
But this particular movie didn’t suffer from that problem. It appeared to have been telecined from the original film, and so was actually progressive in nature. I downloaded the short recording onto my computer, and examined the video closely. Yes, it was definitely progressive.
At this point, I would suggest that you read my article on problems with DVD player deinterlacing, especially the section headed ‘Deinterlacing video-sourced content’. As you will see from this, a moiré effect often happens when fine patterns on progressive source video are inappropriately deinterlaced using the ‘bobbing’ method. That’s what seemed to be happening in this case.
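The difference between bobbing and weaving is easy to show with a toy model. Assume a frame is just a list of scanline values, and a fine texture (like that necktie) alternates line by line. This is my own simplification, not any processor's actual code:

```python
# Sketch: why bobbing destroys fine detail that weaving preserves.
# A 'frame' here is just a list of scanline values; a fine texture
# alternates 0, 1, 0, 1 down the screen.
frame = [0, 1] * 4           # 8 scanlines of fine vertical detail

top    = frame[0::2]         # even lines -> top field
bottom = frame[1::2]         # odd lines  -> bottom field

def weave(top, bottom):
    """Reinterleave the two fields: exact for progressive sources."""
    out = []
    for t, b in zip(top, bottom):
        out.extend([t, b])
    return out

def bob(field):
    """Line-double a single field: halves the vertical resolution."""
    out = []
    for line in field:
        out.extend([line, line])
    return out

print(weave(top, bottom) == frame)   # True: the texture survives
print(bob(top))                      # all 0s: the texture is gone
print(bob(bottom))                   # all 1s: the next field flips
```

Weaving reconstructs the progressive frame exactly. Bobbing turns the top field into a flat grey and the bottom field into a different flat grey, so successive fields flicker between the two; on a pattern that is fine but not perfectly aligned to the scanlines, that flicker shows up as the crawling moiré I saw on the tie.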
The ABC TV program was broadcast as 576i. The usual procedure for converting 576i video to 1080i is to first convert it to 576p (ie. deinterlace it), then scale it up to 1080 lines, and finally re-interlace the result. I’m fairly confident that the Strong SRT-5490 just bobs all 576i material on the way to upscaling it to 1080i. I strongly suspect that this is also done by all HD TV receivers on the Australian market. I shall confirm this over time.
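The line-count bookkeeping for that path can be sketched as follows. The stage names are mine; this is the generic upconversion chain described above, not the Strong's documented internals:

```python
# Sketch of the 576i -> 1080i upconversion path (line counts only, no
# actual pixels). A 576i field carries 288 visible lines.

def upconvert_576i_to_1080i(field_lines=288):
    # 1. Deinterlace: each 288-line field becomes a full 576-line frame.
    #    If this step is a bob, half the source detail is already lost.
    deinterlaced = field_lines * 2             # 576p
    # 2. Scale 576 lines up to 1080 lines.
    scaled = deinterlaced * 1080 // 576        # 1080p internally
    # 3. Re-interlace for the 1080i output: 540-line fields.
    out_field = scaled // 2
    return deinterlaced, scaled, out_field

print(upconvert_576i_to_1080i())   # (576, 1080, 540)
```

The point of step 1 is the whole problem: once the receiver has bobbed the 576i material, even a processor as good as the VP50 downstream only ever sees 1080i fields built from already-softened frames, and the lost detail cannot be recovered.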
So what we really need is a HD TV receiver that can either output 576i over a HDMI output (DVI outputs produce a weird signal when set to 576i, one that isn’t compatible with all the displays I’ve tested), or incorporate high quality video processing that checks the cadence of the actual video, and weaves progressive source material.
UPDATE (Monday, 30 April 2007, 4:26 pm): Further on this, with real proof, in entry ‘Proof of crappy video output from HDPVR’.