Peter comments on the previous post that the picture quality of The Fellowship of the Ring has been criticised at blu-ray.com.
That site seems especially critical of what it claims to be digital noise reduction applied to this movie. If anyone knows how you tell digital noise reduction applied as part of the disc production process from digital noise reduction — or other processes — applied in the creation of the movie in the first place, please tell me. I’d genuinely love to know.
Anyway, I foolishly put a 4.5 star rating at the top of my Blu-ray vs DVD comparison when I started preparing it, and neither re-assessed it afterwards nor thought better of it and removed it entirely until after I’d actually watched the disc. I’ve now removed it until I can give a proper assessment.
As I remarked to my daughter as I was nearing the end of the comparison, the picture seemed a bit soft, but I figured that Peter Jackson had done it that way intentionally in the movie itself to ensure cleaner integration of the CGI stuff into the live action.
That theory remains valid. The site mentioned above suggests that LOTR2 is much better, and LOTR3 better still. But that could be because Peter Jackson’s team got better at creating convincing CGI with practice. Thus my query above.
Anyway, if you look at the figures in my post, the actual average video bitrate runs in the opposite direction to the reported picture quality: a reasonable 23.35Mbps for the first movie, falling to 22.75Mbps for the second, and then a marked reduction to 19.14Mbps for the third.
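As a rough sanity check on what those averages mean in absolute terms, here’s a quick back-of-the-envelope sketch in Python. The running times are approximate theatrical figures I’ve assumed for illustration, not numbers taken from the discs.

```python
# Rough arithmetic only: what the quoted average video bitrates imply in
# absolute terms. Running times are approximate theatrical figures (an
# assumption for illustration, not taken from the discs themselves).
films = {
    "Fellowship of the Ring": (23.35, 178),  # (average Mbps, approx. minutes)
    "The Two Towers":         (22.75, 179),
    "Return of the King":     (19.14, 201),
}

for title, (mbps, minutes) in films.items():
    gigabytes = mbps * 1e6 * minutes * 60 / 8 / 1e9  # bits/s * seconds -> bytes -> GB
    print(f"{title}: roughly {gigabytes:.1f} GB of video")
```

On those assumptions all three encodes come out at roughly 29 to 31 GB of video, so the lower bitrate on the third film looks like a consequence of its longer running time rather than a deliberate quality decision.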
Once again, this suggests that video bitrates at reasonable levels do not correlate well with picture quality. But it also raises the question: why would DNR be used on the first one, but not the second? The point of DNR is to reduce noise, which, being random, is hard to encode in a lossy system such as VC-1. Oh, it might have been done to make the picture look glossier, but why would you do that to one and not the others?
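As a brief aside on that noise point: a lossy encoder has to spend bits on anything it cannot predict, and random noise cannot be predicted. The sketch below uses JPEG rather than VC-1, purely because it’s easy to run, not because it’s what the disc encoder does, but the principle is the same: at a fixed quality setting, the noisy frame typically costs several times as many bytes as the clean one.

```python
# Illustration only: how many bytes a lossy encoder (JPEG here, not VC-1)
# spends on a clean frame versus the same frame with random noise added.
import io
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)

# A smooth synthetic "frame": a plain horizontal gradient.
clean = np.tile(np.linspace(0, 255, 1920, dtype=np.uint8), (1080, 1))

# The same frame with mild random noise standing in for film grain.
noisy = np.clip(clean.astype(np.int16) + rng.normal(0, 10, clean.shape),
                0, 255).astype(np.uint8)

def jpeg_bytes(arr, quality=85):
    buf = io.BytesIO()
    Image.fromarray(arr).save(buf, format="JPEG", quality=quality)
    return buf.tell()

print("clean frame:", jpeg_bytes(clean), "bytes")
print("noisy frame:", jpeg_bytes(noisy), "bytes")  # typically several times larger
```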
In other words, given that Fellowship and Two Towers are about the same length, have about the same video bitrate, and are being released at the same time, why would different production techniques be applied?
I don’t know, but in one respect they certainly were.
My comparison technique involves ripping all the ‘I’ frames from the Blu-ray as a first step. The ‘I’ frames are standalone frames that do not require other frames in order to be reconstructed.
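I won’t go into the exact tooling here, but for anyone wanting to try something similar, ffmpeg’s select filter can pull out just the ‘I’ frames. This is a minimal sketch, assuming the main feature has already been ripped from the disc to a single file; the filename is a placeholder, and this is not necessarily the process used for these comparisons.

```python
import subprocess

SRC = "main_feature.m2ts"  # placeholder: the main feature remuxed to one file

# Keep only 'I' frames and write each one out as a high-quality JPEG.
subprocess.run(
    ["ffmpeg", "-i", SRC,
     "-vf", r"select=eq(pict_type\,I)",  # select filter: pass 'I' frames only
     "-vsync", "vfr",                    # one image per selected frame, no duplicates
     "-q:v", "2",                        # high JPEG quality
     "iframe_%06d.jpg"],
    check=True,
)
```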
As it happens, there are somewhere between 12,000 and 12,999 ‘I’ frames in the Fellowship of the Ring Blu-ray (I have deleted all but about 30 of them, so I don’t have a precise count), and 20,721 ‘I’ frames in The Two Towers. So roughly one frame in every twelve and a half on LOTR2 was an ‘I’ frame, while only one in every twenty and a half on LOTR1 was. Can this make that big a difference to picture quality?
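One way to get that sort of count, and the average ‘I’-frame spacing, without extracting any images is to ask ffprobe for the picture type of every frame. It decodes the whole stream, so it’s slow, and again this is only a sketch with a placeholder filename, not necessarily how the numbers above were obtained.

```python
import subprocess

SRC = "main_feature.m2ts"  # placeholder filename

# Single decode pass: report the picture type (I/P/B) of every video frame.
types = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_frames", "-show_entries", "frame=pict_type", "-of", "csv=p=0", SRC],
    capture_output=True, text=True, check=True,
).stdout.split()

total = len(types)
iframes = types.count("I")
print(f"{iframes} 'I' frames out of {total} frames: "
      f"one every {total / iframes:.1f} frames on average")
```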
Update (9:55pm): I’m selecting frames from The Lord of the Rings: The Two Towers in preparation for the comparison. There is no doubt that this installment is a great deal sharper than The Fellowship of the Ring. The blu-ray.com review mentioned above hypothesised that the Blu-ray version of LOTR1 may have been pulled from an early telecine of the film. But while the framing of the DVD and Blu-ray versions of LOTR1 is slightly different, for LOTR2 this isn’t the case at all: they appear to be identical. These two are more likely to have come from the same telecine than the two versions of LOTR1.