Once again, the finger is pointed at digital noise reduction, among other things. Let me quote from this review:
Detail is far from consistent, as from shot to shot in any scene, it’s almost like watching the film from multiple grade sources culled together. There are moments where distance shots boast brilliant clarity and the finest of minute detail, then a close up will follow that’s muddled beyond belief. While I cannot say what created this issue, I can say that DNR (yes, Digital Noise Reduction) played a large part. There are numerous excessively smoothed and muddy moments, and they’re hardly difficult to spot.
Now here’s my problem: if you want to apply DNR to a movie to prepare it for transfer to Blu-ray — for whatever reason — how do you do it? Do you sit down and carefully apply different levels on a scene-by-scene, moment-by-moment basis? Or do you click the ‘Mild’, ‘Medium’ or ‘High’ radio button in the DNR section of the encoding software, and let the automatic processes do their work? I suspect the latter.
That would give a consistent look, though, not a variable one.
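To make that reasoning concrete, here is a minimal sketch of how a one-click preset might work. The preset names, values and the simple box-blur filter are all illustrative assumptions on my part, not taken from any real encoder — the point is only that a preset maps to a single fixed smoothing strength applied uniformly to every frame, which is exactly why it would produce a consistent look rather than a shot-to-shot variable one.

```python
# Hypothetical DNR presets: each name maps to one fixed smoothing
# strength (a box-blur radius here), applied the same way everywhere.
PRESETS = {"mild": 1, "medium": 2, "high": 3}

def box_blur_row(row, radius):
    """Average each pixel with its neighbours within `radius`."""
    out = []
    for i in range(len(row)):
        lo = max(0, i - radius)
        hi = min(len(row), i + radius + 1)
        window = row[lo:hi]
        out.append(sum(window) / len(window))
    return out

def apply_dnr(frame, preset):
    """Apply the same strength to every row of the frame.

    Because the radius never changes from shot to shot, the result
    is uniformly smoothed -- consistent, not variable.
    """
    radius = PRESETS[preset]
    return [box_blur_row(row, radius) for row in frame]

# A noisy 2x4 "frame" of brightness values:
frame = [[10, 200, 10, 200],
         [200, 10, 200, 10]]
smoothed = apply_dnr(frame, "high")  # every pixel flattened to 105.0
```

Scene-by-scene grading, by contrast, would mean choosing a different radius for each shot by hand — far more work, and far more likely to produce the uneven results the reviewer describes.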
So what if all the complaints about this movie should apply not to the Blu-ray but to the movie itself? What if the Blu-ray rendition is actually a very good representation of the movie, as it originally appeared at the cinemas?
Might it be? I don’t think we’ll ever know.
You see, movie reviewers rarely seem to comment on technical issues, except when they interfere with the storytelling. When was the last time you read a movie review in a newspaper or magazine — and here I’m talking about current release movies, showing only at the cinema — which devoted a paragraph to the amount of detail in the cinematography, differences from scene to scene in film grain, and so on? No, they are all about the story and the characters.
In any case, back in 2001 the reviews I read suggested that the viewers were mostly blown away by the special effects. Now, nine years later, the special effects sometimes seem rather too obvious. At the time they didn’t. Each new generation of special effects seems convincing to that generation, but after a few years seems artificial. Especially animated ones. Presumably people back in the 1980s thought that Terminator Arnie’s metal skeleton walking through the fire, or his fake head having its eye extracted, were realistic (enough, anyway, to suspend disbelief). Likewise, ED209 in Robocop now looks horribly clunky.
Blu-ray reviewers are far more interested in the technical stuff. They’re selected for their roles by that interest. I’m like that too. Last night I watched a couple of episodes of ‘Claymore’ on Blu-ray and ended up spending more time rewinding and rewatching a few short segments, over and over, trying to nail down an issue in the picture quality, than actually watching the show as a story. How many reviewers will even notice that there’s a strange shimmer on the grille of a car 1:22 into Public Enemies? Movie? Disc? Equipment? Still to be determined.
All of which leads me to a question, and I’d appreciate feedback on this one. The ‘Video Quality’ rating on my reviews, and just about everyone else’s Blu-ray reviews, is meant to summarise our discussions of the picture quality. But should we give five stars for a great looking final result, or five stars for an extremely accurate rendition of the movie?
What if The Fellowship of the Ring was a bit weak in terms of picture quality at the time that it was released to cinemas (they were still learning, I expect, and would have gotten better in the next two instalments)? What if they did (as I suspect) soften the overall image during the movie production just a little to better disguise the live-action/special effects transitions? Is this the fault of the Blu-ray producers?
I can easily imagine that the actual transfer to Blu-ray of a movie might be the most accurate ever achieved, indeed the most accurate achievable, yet score poorly because the movie itself is full of less than transparent digital effects.
And then we get to older movies. On Blu-ray Dr No looks good, and The Godfather looks much improved, but both of these were restored. How many stars do we give to the Lowry Process used on the former, even if some of the detail is actually invented by the automated digital processes, rather than originally appearing on the film?
So, should I give five stars because the Blu-ray of a movie looks good, or because it is accurate? And how do I know if it actually is accurate? And what do I do if those two things are in tension with each other?