Home Entertainment Blog Archive
Brought to you by your friendly, opinionated, Home Entertainment and Technology writer, Stephen Dawson
Here I report, discuss, whinge or argue on matters related to high fidelity, home entertainment equipment and the discs and signals that feed them. Since this Blog is hand-coded (I like TextPad), there are no comments facilities. But feel free to email me at scdawson [at] hifi-writer.com. I will try to respond, either personally or by posting here emails I consider of interest. I shall assume that emails sent to me here can be freely posted by me unless you state otherwise. This archive is for an uncertain period commencing Thursday, 13 April 2006
Blu-ray Region Codes -
Monday, 26 June 2006, 4:39 pm
The Korean-based Dr David Steel, the Vice President, Marketing Team, Digital Media Business, for Samsung Electronics, has kindly provided the following information on Blu-ray region codes. There will be three codes:

Region A: North, Central and South America, Japan, Korea and East Asia other than China
Region B: Europe, the Middle East, Africa, Australia and New Zealand
Region C: China, Russia, India and the rest of Asia
How good is HDTV anyway? -
Monday, 26 June 2006, 4:05 pm
Of course, I love high definition. As I sit here writing, my son is playing 'Call of Duty 2' on an Xbox 360, set to 1080i output, playing via a Denon AVR-4306 receiver (with Dolby Digital 5.1 audio) to an InFocus IN76 projector (720p DLP). Marvellous! But there's more to picture quality than just resolution, although resolution is a big part of it.

Let us look closely at three screen shots, courtesy of anthonysimilion at the DBA Forum (I have used Capture 3 from his captures for no good reason other than it was the first set I viewed). Now please excuse the size of the graphic to the right, but I felt it necessary to take a large section of each of the three shots to compare. The top one is from a Channel 7 SDTV broadcast of an episode of 'Lost', the second from the so-called 'high definition' version also broadcast on Channel 7 (so-called, because it's only 576p). The final one he managed to dig up as an example of a 720p high definition screen shot of the same episode being broadcast on ABC TV in the United States. The last shot is in the original resolution, while the first two have been scaled up slightly to match the size of the original. Now, which one do you think is best? (You may need to copy the graphic to your computer and slice it up into three pictures for side-by-side comparison.)

In the natural scheme of things, you might think that 720p would be better than 576p which, in turn, would be better than 576i. After all, at least notionally, 576p has twice as much information as 576i, and 720p has more than twice as much information (2.2 recurring, to be precise) as 576p. But there are complications.

The first is the source. Was this high definition in the first place? If it originated as 576i, then transmission at 576p or 720p doesn't offer anything much, except possibly slightly better picture quality due to a professional grade scaler/de-interlacer at the television station end. Only possibly, though, because there is no guarantee that the pro unit is better than a consumer level version in the home. The reason is that while pro equipment is normally better when it first comes out, TV stations don't upgrade equipment unless there is a commercial imperative to do so. A professional scaler/de-interlacer produced circa 2000 is almost certainly inferior to a good quality unit built into some 2006 display devices.

Now let's assume that the picture was, in fact, high definition in the first place, say 720p. Even this doesn't settle the matter, because the other major control over picture quality is the level of compression used. Digital TV uses some form of MPEG2 video compression. This works by tossing out information determined by the built-in algorithms to have the least effect on visual quality. In this sense, it is very similar to JPEG still image compression. But there are two differences. First, any particular frame in an MPEG2 stream tends to be more heavily compressed than a JPEG. You can get away with this in MPEG because the frames are flipping over 25 times per second, so any visual imperfections in one frame, unless they are too high in contrast, tend to get 'ironed out' by the quick sequential flow. The second difference is that with MPEG2, the elimination of 'unnecessary' visual material takes place in the time domain in addition to the spatial domain. So, in fact, only every 12th or 15th frame in a movie is held fully (albeit in a heavily compressed format).
All the intermediate frames are constructed from the previous or following fully-held frame, plus 'difference' information held for that frame. Poorly implemented MPEG2 compression can often be spotted by a difference in subtle movement between parts of the image. For example, this might be a barely perceptible shifting of the facial features during a closeup, while the outline of the face remains steady. This can lead to a profoundly disquieting viewing experience. But the more common effects of MPEG2 compression are blockiness, where the boundaries of the compression blocks become visible; 'mosquito noise', a fizzing halo of artefacts around sharp edges; and posterisation, where smooth colour gradations collapse into visible steps.
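To make the temporal side of this concrete, here is a toy sketch of my own devising (nothing like a real encoder, which adds motion compensation, DCT transforms and lossy quantisation on top): hold one frame in full every twelve, and store only per-pixel differences for the frames in between.

```python
# Toy illustration of MPEG2-style temporal compression: one fully-held
# 'I-frame' every GOP_SIZE frames, with only per-pixel differences stored
# for the frames in between. Real MPEG2 is far more sophisticated.

GOP_SIZE = 12  # one fully-held frame per twelve, as described above

def encode(frames):
    """frames: a list of frames, each a flat list of pixel values."""
    stream = []
    for i, frame in enumerate(frames):
        if i % GOP_SIZE == 0:
            stream.append(('I', list(frame)))    # keyframe, held in full
        else:
            prev = frames[i - 1]
            delta = [p - q for p, q in zip(frame, prev)]
            stream.append(('D', delta))          # difference data only
    return stream

def decode(stream):
    frames = []
    for kind, data in stream:
        if kind == 'I':
            frames.append(list(data))
        else:
            # rebuild this frame from the previous one plus the differences
            frames.append([q + d for q, d in zip(frames[-1], data)])
    return frames
```

In this toy the differences reconstruct the original exactly; in real MPEG2 the difference frames are themselves lossily compressed, and when the broadcaster's bit budget runs short it is the subtle gradations that get thrown away first.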
Repeated compression, decompression and recompression makes things especially bad. An example of this is many sporting events. It is rare that you will notice the noise issue with studio material or drama, but watch the cricket or (more topically) the World Cup on a decent, large, high resolution screen and you will see a halo of noise following the players around the field. I speculate that the action is MPEG2 compressed at some point before it gets to your TV station, presumably for efficient transmission, and is then either transcoded or decompressed and recompressed during broadcast to you, emphasising all the MPEG2 artefacts.

So what has all this to do with our pictures to the right? First, I think we can all agree that in this case the 720p image is ever so slightly sharper than the 576p one and this, in turn, is just a touch more detailed than the 576i one. But to my eye, the 576p version is the best of the three. The reason is excessive posterisation of the 720p image. Posterisation is the name given to a reduction in smoothness of the colour or grey-scale gradations, so that the picture looks less like a photo and more like a paint-by-the-numbers picture. Look around the left cheekbone and you will see how detail is lost not because of low resolution, but because subtle differences in colour have been merged into a mono-colour splodge.

So, in the end, it often all comes down to the bitrate used by the TV station. Remember, the pixel clock rate for 576i is 13.5MHz and for 576p it is 27MHz, but for 720p/50 it is a massive 74.25MHz! You need a lot more bits allocated to attempt to represent that realistically.
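Those pixel clock figures are easy to check from the total raster sizes (active picture plus blanking). Here's a quick back-of-envelope in Python; the raster numbers below are the ones I understand the standards (BT.601 for 576i, BT.1358 for 576p, SMPTE 296M for 720p/50) to specify, so treat them as my assumption rather than gospel.

```python
# Back-of-envelope check of the pixel clock figures quoted above.
# Total raster = active pixels plus horizontal/vertical blanking.

formats = {
    # name: (total samples per line, total lines, frames per second)
    '576i (BT.601)':  (864, 625, 25),    # 25 interlaced frames/s
    '576p':           (864, 625, 50),    # same raster, 50 progressive frames/s
    '720p/50 (296M)': (1980, 750, 50),
}

for name, (samples, lines, rate) in formats.items():
    clock = samples * lines * rate
    print(f'{name:16s} pixel clock = {clock / 1e6:.2f} MHz')

# 576i (BT.601)    pixel clock = 13.50 MHz
# 576p             pixel clock = 27.00 MHz
# 720p/50 (296M)   pixel clock = 74.25 MHz
```

Note that 74.25 divided by 27 is exactly 2.75: the broadcaster needs getting on for three times the bits just to stand still on per-pixel quality.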
What output resolution to use for HDTV? -
Thursday, 15 June 2006, 11:57 am
Okay, you have a nice shiny new high definition TV receiver. You work through its setup menus and discover you can set its high definition output resolution to 576p, 720p or 1080i. Which should you use? Preferably, none of them. Some HDTV receivers also offer a 'through' option (eg. the Toshiba HDD-J35 and the Topfield TF7000HT). This sends the picture on at whatever resolution it was transmitted at. So when you switch to an SD channel, the picture comes through at 576i. On SBS HD or ABC HD it will be 576p, on WIN HD or SCTEN HD it will be 1080i.

Why? Almost certainly the vertical resolution of your display (in widescreen mode) will be one of the following: 480, 576, 720, 768, 788, 1024 or 1080. That, I think, covers all the vertical resolutions of all the displays I've seen (that 788 is for LCoS projectors with a 1,400 pixel wide resolution). In terms of common resolutions, there are 480, 576, 720, 768 and 1024. As you can see, only two of those match the output options available from your HD STB. One way or another, much of the HD programming is going to have to be scaled from the signal's native resolution to the display resolution. The question is: where?

The STB can do it. Or your display can do it (except for CRT TVs, which often demand a 576i or, at most, 576p signal). Which one in your setup does the best job is for you to work out by experimenting. But what you do not want is for both of them to be doing it. Plasma and LCD displays are most commonly 480, 768 and 1024 pixels tall. None of these matches the output resolutions available from an HD STB. Scaling can be done to a very high standard, but inevitably any manipulation of the picture induces some degradation. You typically won't notice this if you're using high quality gear. But what you definitely don't want is for scaling to be performed twice. And that's what will happen if you're using, say, 1080i output from your STB to watch 576i or 576p material on a 768 or 1024 pixel tall plasma.

The exception is if you have a 720p projector -- a very common resolution. In that case, once again, experiment with setting the HD STB to 720p output, and to 'through' output, and see which gives the best results. In my experience, the scalers in projectors tend to be a bit better than those in HD STBs. But one caution: find out if there is a 'film mode' or 'cinema' option in the projector's menus. These sometimes force 'weave' deinterlacing, which means that video-sourced interlaced material will appear with marked combing, and look horrible. If that's the case, switch that mode off. It seems to make no noticeable difference with film-sourced PAL material anyway (I think these modes are primarily for NTSC signals).
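If it helps to see the 'scale once, not twice' argument laid out, here's a little sketch of my own. Everything in it (the mode list, the counting) is illustrative only, not any particular box's firmware, and it ignores the separate question of deinterlacing quality.

```python
# A sketch of the 'scale once, not twice' argument. Counts how many times
# the picture gets rescaled between the transmitter and the glass.

STB_OUTPUT_MODES = [576, 720, 1080]   # typical fixed HD STB output heights

def scaling_steps(native, stb_out, panel):
    """Number of rescales for a given source, STB output and panel height."""
    steps = 0
    if stb_out != native:
        steps += 1                    # the STB scales native -> stb_out
    if panel != stb_out:
        steps += 1                    # the display scales stb_out -> panel
    return steps

panel = 768                           # e.g. a 1366x768 plasma or LCD
sources = (576, 720, 1080)            # native heights of different channels

for mode in STB_OUTPUT_MODES:
    worst = max(scaling_steps(s, mode, panel) for s in sources)
    print(f'STB fixed at {mode}: up to {worst} rescales in the chain')

through = max(scaling_steps(s, s, panel) for s in sources)
print(f"STB in 'through' mode: at most {through} rescale")
```

On a 768-line panel everything must be rescaled at least once, but any fixed STB output mode guarantees that some channels get rescaled twice; 'through' mode caps it at one.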
Pioneer moving out of DVD recorders? -
Wednesday, 7 June 2006, 5:33 pm
There's a story around that Pioneer is pulling out of the DVD recorder business. I wouldn't like that because I think Pioneer's DVD recorders are the best around at the moment. So I asked Pioneer Australia for its comments. It provided Qs and As (Word DOC format, 27kB, right click and choose 'Save Target As ...') from the parent company and offered the following comment: 'I have attached an announcement from Pioneer, which details that Pioneer is not pulling out of manufacturing DVD recorders altogether. The newspaper story was inflated.' I don't think I need to say anything more.

UPDATE (Thursday, 15 June 2006, 11:54 am): Pioneer's new Q&A, issued after its 6 June management meeting:
Samsung promises first Australian Blu-ray player -
Wednesday, 24 May 2006, 10:27 am
At yesterday's product showing, Samsung demonstrated a Blu-ray player. It says it intends to be the first on the Australian market with a Blu-ray player, slated for the fourth quarter of this year, and it's aiming at a price of under $AUS1,000. And it's hoping for a Blu-ray recorder early next year. Samsung is definitely behind Blu-ray rather than HD-DVD. It says that of the seven major Hollywood film studios, six have promised Blu-ray titles, while only three are supporting HD-DVD. If that continues, then Blu-ray will be the winner here, but I'm still inclined to think that eventually we'll have dual format players, and possibly recorders. Both use the same wavelength laser, although the thickness of the protective layer through which the laser must focus differs between the discs (0.1mm for Blu-ray, 0.6mm for HD-DVD). I'm presently trying to find out more from the good folk at Samsung about the 1080p issue for Blu-ray that I've previously wondered about.
Philips leads the field with digital TV/DVD recorder -
Tuesday, 23 May 2006, 11:10 pm
Well, today was busy. First I went to a Samsung product showing in Melbourne, and then a Philips one in Canberra. Both Samsung and Philips have some interesting new technology to enhance black levels in LCD TVs, and the new Philips LCD TVs add a further enhancement in the form of backlights that scan down the picture at 75 hertz, illuminating each pixel only during the time in which its setting has settled. This effectively reduces the pixel response time, for markedly improved coherence on moving images.

But the really exciting thing at the Philips launch was its new DVDR9000H DVD recorder. It, and the next model down, feature HDMI outputs (with video upscaling). What really distinguishes this model is that it has a standard definition digital TV tuner built in. If I'm not mistaken, this will be the first consumer DVD recorder so equipped. If all goes well, that ought to give significantly better picture quality in your recordings, even compared to using an SD box plugged into an existing DVD recorder.
This unit is pricey, with the initial RRP set at $1,699 for the June 2006 release. But for that you also get a 400GB hard disk. I've put some questions to Philips about the unit to satisfy a few concerns and hopes I have:
Reading the spec sheet further, I see that the answer to '5' may be provided: 'Digital 5.1-channel recording lets you capture – along with the video – the original sound from digital multichannel sources such as satellite receivers to store it on DVD. The DVD Recorder can record transparently the sound fed to the digital audio input, such as Dolby Digital, DTS or MPEG Multichannel.' I wonder if it records all of them?

UPDATE (Wednesday, 24 May 2006, 10:43 am): My brother informs me that the Philips DVD recorder buffering system isn't unique at all (corrected above), but is also used in some low cost hard disk DVD recorders such as 'MTV' and 'Kross'. These use DVD+R/+RW as their staple, so I imagine they've picked up some technology from Philips. Or maybe not.
iPod weirdness -
Thursday, 13 April 2006, 3:52 pm
Since home theatre receivers are now coming thick and fast with bridges for Apple iPods (Gen 3 and later), Apple has kindly lent me a very fancy iPod indeed. It does video, has a 60GB hard disk and so on. This device is a marvel of electronic engineering. You wouldn't know that the hard disk is spinning because it seems to be perfectly silent. It's amazingly compact. The display is clear. The controls are a marvel, especially the touch sensitive volume/scroll control. And while I haven't checked the output levels yet, it seems to have plenty of oomph to drive even my less sensitive headphones.

Now for the downside. We have a house full of Windows computers. I know Windows, so I'm far more comfortable with it than the Mac, and to be honest, from my experience the rhapsodies in which Mac owners commonly engage are vastly overblown. So I can't comment on how well an iPod works with a Mac. But with a Windows computer, the software is appalling.

My first recent experience was on Christmas Day. We gave my daughter an iPod Shuffle and I then spent two or three hours trying to get it to work, with multiple reboots of both the computer and the Shuffle, loading updated firmware several times, and so forth. Incredible. Part of the problem was the failure of the software to let you know what was going on. It'd be doing something, but there would be no indication of it, so it looked like the process had frozen. Mind you, this is after waiting for several minutes.

Anyway, when the proper iPod turned up, I resolved to do things very carefully. No precipitous closing of programs or rebooting. The first thing I noticed was that no guidance was given on conflicting messages. Windows would throw up a message asking whether you wanted to reboot, but the install program would throw up a different message on a different subject. Which to do first? How about some decent documentation! Presumably I made some wrong choices, because this time it took me more than three hours to get things going. I should point out at this stage that I've installed a dozen or more MP3 players from other major brands (Creative, iRiver, BenQ, Sony, Samsung, Topfield and so forth) and never once had any problems. Even with Sony's silly interface software.

When I did get it going, things were incredibly slow. Windows kept popping up a message asking me why I was using a USB 1.1 port, when I should be using a USB 2.0 port. Problem was, it was a USB 2.0 port on the front of my computer. I'd been using this port a couple of days earlier with a Topfield player, and it was downloading a whole album in six seconds. The iPod was taking 72 seconds: a twelvefold slowdown, consistent with the iPod having negotiated only a USB 1.1 connection on that port. I resolved that one by switching to a rear port. I checked the front port and the Topfield was still as fast as ever. With the rear port things were acceptable, but the iPod still took around 14 seconds for a download of the same album.

The iTunes software is a dog. It takes ages to start up, and then more ages to recognise the iPod, the plugging in of which invoked it in the first place. If you have updates set to automatic, then apparently you can connect and disconnect the iPod at will. But that means ignoring the prominent 'Do not disconnect' message on the iPod's screen. With automatic updates set, you cannot drag additional tracks to the iPod. If you mark a track for download, then tough luck: you have to wait until the next disconnect and reconnect to make the download happen. If you set it to manual updates, you are warned that you cannot disconnect the iPod until the software is closed down.
Both systems are highly inconvenient.

Finally, the way that things are organised on the iPod is really weird. In its favour, it installs itself as a mass storage device, so you can copy files to and from it easily. But don't bother doing that with music. You must use iTunes for that purpose, because the iPod uses the weirdest system of non-organisation I've seen. First, every MP3 is renamed to something like ABCD.mp3; that is, four upper case letters followed by a lower case 'mp3'. There is no apparent relationship between the four letters and the original track title, nor its contents. Second, tracks are placed, seemingly at random, in a folder entitled FXX within the iPod_Control\Music folder. By XX, I mean two numerals, from 01 to whatever it takes. You'd expect that each folder would contain all the tracks from one artist, or one album, or some such. But things just aren't organised that way. The graphic shows some of the contents of one folder. Note that virtually all of these tracks are from CDs, the other contents of which are also on the iPod.

Pity about this, because the hardware engineering of the iPod is just so damned good.
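For what it's worth, the real track titles are still sitting inside the files' own tags, so a few lines of Python can map the gibberish names back to something readable. This is my own stdlib-only sketch: it parses only old-style ID3v1 tags (the last 128 bytes of an MP3), so files carrying only ID3v2 tags will need a proper tag library, and the drive letter is whatever Windows happens to have assigned your iPod.

```python
# Walk the iPod's iPod_Control\Music\FXX folders and try to recover real
# track titles from ID3v1 tags, since the four-letter filenames tell you
# nothing. ID3v1 only; many files carry ID3v2 instead, which this skips.

import os

def id3v1_title(path):
    """Return the ID3v1 title of an MP3, or None if there's no ID3v1 tag."""
    with open(path, 'rb') as f:
        f.seek(0, os.SEEK_END)
        if f.tell() < 128:
            return None
        f.seek(-128, os.SEEK_END)
        tag = f.read(128)
    if tag[:3] != b'TAG':               # no ID3v1 tag present
        return None
    return tag[3:33].rstrip(b'\x00 ').decode('latin-1', 'replace')

root = r'E:\iPod_Control\Music'         # assumed drive letter; adjust to suit
for dirpath, _dirs, files in os.walk(root):
    for name in sorted(files):
        if name.lower().endswith('.mp3'):
            title = id3v1_title(os.path.join(dirpath, name))
            print(dirpath[-3:], name, '->', title or '(no ID3v1 tag)')
```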