The death of paper publishing

Amazon has announced the ‘Kindle DX’ portable document reader, with a much larger screen and native PDF support.

Starting to look very impressive, and makes electronic books and magazines a real possibility. To seal the deal, it needs to be thinner still, more robust (or flexible — even able to be rolled up) and offer a colour display.

Posted in Computer, General Tech | Leave a comment

Back up, and thanks!

I’m back. An excellent Telstra/Bigpond tech guy came around first thing this morning and got me going. Something was definitely wrong at the other end. He provided a new Bigpond two wire gateway for my ADSL connection. My old Alcatel SpeedTouch ADSL modem, supplied by Telstra when I first signed up years and years ago, bears a sticker on the underside declaring that its warranty expires in September 2001.

Once that was going, my Belkin router refused to provide a path through to the modem, so I figured it was time to replace it. It took me three stores to find a non-modem, non-wireless router. Apparently I’m pretty old-fashioned.

Found a Netgear one. Took it home and installed it, and everything seems to be working properly.

A couple of times during all this I rang RG Computer Repairs, which I found in the phone book. I was just after a small outfit that could be flexible and was local. RG is only a couple of suburbs away.

As it turned out, Rodney Green (it’s his business) didn’t need to come out, but spoke great sense over the phone. So I just wanted to say to anyone in the Canberra area: if you have a problem, try RG Computer Repairs on 6291 5444 or 0404 041 921.

Posted in Admin, Computer, General Tech | Leave a comment

Speed demon

One word describes why I felt I needed a new computer: Blu-ray. My old computer would not play Blu-ray discs in any watchable way. Not that I watch them on a computer, but when checking things out, half the time I would switch off the speakers because the stuttering sound was so irritating. With the new computer (Intel Core 2 Duo E8400 running at 3GHz with 4GB of RAM), PowerDVD plays them smoothly and well.

I also scan lots of Blu-ray discs with BDInfo in order to learn more about their inner workings. So far I’ve scanned about 70 of them. On the old computer (a Pentium D) that took a lot of time. A typical dual layer disc would take three to four hours, and use 40% or more of the CPU, seemingly increasing as the process continued. I took to starting a scan when I knew I’d be away from the computer for a few hours, usually at bed time.

I’ve just scanned the Blu-ray of Sleeping Beauty (DVD vs Blu-ray comparison coming soon). The disc is 31.5GB. I would have expected it to take at least three hours. It took 37 minutes!

Now, if only I had Internet!

Posted in Admin, Blu-ray, Computer, Disc details | Leave a comment

Apple iTunes, a masterpiece of obfuscation

Goodness knows when you will see this. Internet for me, at the moment, is thoroughly dead. New computer. Spent Thursday night loading stuff. Likewise Friday morning. During one of its reboots, its graphics card failed. The shop people actually came out and switched it over. Loaded more stuff. Friday night, Internet stopped working. Thought it might have been at the other end, so I waited until this morning. Spent an hour and a half on the phone with a nice lady from India, who eventually decided to kick it upstairs to Telstra Bigpond (my ISP). They will be on to me within two working days. Since Anzac Day falls on a Saturday this year, Monday is the holiday. So I’m hoping that by Wednesday something will be happening.

Meanwhile, if I don’t put my thoughts down when they occur to me, they will never make it here.

So I’m still installing software in the meantime. The computer itself still works. Time to put in iTunes. As I’ve said lots of times, I love the iPod hardware and detest the iTunes software.

I go looking for the iTunes installation software. I have a CD, of course, from a couple of years ago. But I prefer to load in the latest — the latest, that is, to which I have access in the absence of the Internet. I find that I have the 60MB iTunes setup program for, it turns out, version 7.7. Not too old, and the file is dated July last year, so I figure that will do for the time being.

It installs, but tells me at the end that the file ‘iTunes Library.itl’ is newer than the current version of iTunes. I try to start up iTunes. It tells me the same thing and won’t start.

No hint on what to do. None of the bleeding obvious: ‘Would you like to delete this file and proceed?’ No, it just leaves you hanging. I tried to invoke it by plugging in the iPod. But it delivered the same message, leaving me with an iPod connected and ‘Do Not Disconnect’ written on its screen (it’d probably be okay to disconnect anyway, but you can make certain by clicking on the ‘Safely Remove Hardware’ icon in the System Tray and following the instructions).

Anyway, I’ve deleted that file (it’s under ‘My Music’) and restarted. Not hard, but if you were a novice, what would you do next?

Posted in Audio, Portable, Rant | Leave a comment

Day and Date everything

Warner Bros says that it is proposing to release The Curious Case of Benjamin Button ‘day and date’ (where this term comes from I don’t know, but it means ‘on the same day’) in just about all possible formats on 3 June 2009. That is, it will be released as a DVD and as a Blu-ray, plus Foxtel Box Pay Per View, iTunes, Bigpond and, says Warner, ‘the online presence of various rental chains’.

One of the official reasons is: ‘we are offering a legitimate alternative to piracy’.

Sounds like a good idea to me. I don’t like the idea of Pay Per View, nor the various on-line distribution mechanisms. But I’m all in favour of customer choice. Offer it all ways and let customers choose the format that best suits them.

Posted in Blu-ray, Disc details | Leave a comment

New Blu-ray disc notes

Icon Film Distribution will soon (1 May, I think) be releasing Slumdog Millionaire on Blu-ray. My Blu-ray vs DVD comparison for this title is here. A sample:

[Image: sample frame from the Slumdog Millionaire Blu-ray vs DVD comparison]

That company will also be releasing Transporter 3 on Blu-ray on 10 June 2009.

Paramount is releasing a bunch on Blu-ray over the next few weeks including Madagascar: Escape 2 Africa, an SE version of Madagascar, The Heartbreak Kid, Hotel For Dogs and Ghost Town. What I’m looking forward to is what’s due in mid-June: Saturday Night Fever, Grease, Disturbia, The Truman Show, Ferris Bueller’s Day Off and Ghost.

Universal is releasing The Changeling on Blu-ray on 17 June 2009 and The Unborn on Blu-ray on 1 July 2009.

Warner Bros is releasing a bunch, too, in what it calls a June Wave: American History X, Robin Hood: Prince Of Thieves (Extended Cut), 2010: The Year We Make Contact, The Assassin (this is the remake of Nikita, with Bridget Fonda and was known in the US as Point of No Return), Amadeus (Director’s Cut), The Wedding Singer, A Time To Kill, Collateral Damage, Body Heat, The Gauntlet, Tango & Cash, and Above The Law.

That qualifies as a wave in my book.

Posted in Blu-ray, Disc details, DVD | Leave a comment

‘600Hz sub fielding’

A member of the public has written to one of my editors asking, in short, what is ‘600Hz sub fielding’? The editor passed it on to me, and my initial thought was: ‘I have no idea’. Fortunately our correspondent quoted some material he’d found on the Web. So I did a bit of googling around myself and found that this seems to be a Panasonic technology for its plasma displays, and was preceded by a 480 hertz version, which Panasonic named its ‘480 Hz sub-field drive’. This version is described here. It seems likely that the only difference between the two is that the 600 hertz version does more of the same.

Here’s what it says:

A standard video signal is actually a series of still images, flashed on screen so quickly that we believe we are watching a moving image. The typical frame rate used in North America is 60 frames per second (60Hz) meaning that a TV would display 60 individual still images every second. Sub-field drive is the method used to flash the individual image elements (dots) on a plasma panel. For each frame displayed on the TV the Sub-field drive flashes the dots 8 times or more, meaning that the dots are flashing 480 times per second (480Hz) or more. (Example: 60 frames per second x 8 sub-fields = 480 flashes per second).

Underneath that is a wonderfully ambiguous graphic. It shows six picture frames. Underneath this row of frames it says ‘Each Original Frame has Over 8 Sub-fields. 480 Hz Sub-Field Drive (information is changed at each dot 480 times per second, or more)’. Underneath each of the frames is a set of eight rectangles, each with a letter. Under the first frame the letter is ‘A’ in each of those eight rectangles. Under the second it is ‘B’ and so on.

So the caption for the graphic implies that the ‘information is changed at each dot’ eight times per frame or more. But the little rectangles imply that the information stays the same. So what are we to make of this?

First, ‘field’ in this context would mean video field. An interlaced video frame consists of two fields. In reality, in the US system, most of the time the picture consists not of 60 frames per second, but either 24 or 30 frames per second. But, absent Blu-ray, this is delivered in the form of 60 fields per second. A field is half a frame. One of the fields consists of the pixels in every second row of the frame, while the other field consists of the rest of them. More about that here.

So ‘sub-field’ would mean ‘below the level of the field’. Given that a field is largely a time-based object, a sub-field only occupies part of the time taken by a whole field. In the case of a 480 hertz sub-field, that is one eighth of the time (60/480 = one eighth).
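
For concreteness, here is that arithmetic as a trivial sketch in Python. Note that pairing 600 hertz with ten sub-fields is my inference from the 60 x 8 = 480 pattern, not something Panasonic states:

    # Sub-field drive rate = frame rate x sub-fields per frame.
    frame_rate = 60  # frames per second, North American video

    for subfields in (8, 10):  # 10 sub-fields for 600Hz is my assumption
        drive_rate = frame_rate * subfields
        print(f"{subfields} sub-fields: {drive_rate}Hz, "
              f"{1000 / drive_rate:.2f}ms per sub-field")
    # 8 sub-fields: 480Hz, 2.08ms per sub-field
    # 10 sub-fields: 600Hz, 1.67ms per sub-field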

Now, what follows is my guess work. I shall ask Panasonic to cast an eye over it and let you know in due course if they agree.

Unlike LCD TVs, plasma TVs work by stimulating some gas, which emits ultraviolet light, which in turn excites some coloured phosphor, which then glows. Compare with CRT TVs: these shoot electrons down a vacuum tube, which excites some coloured phosphor, which then glows.

CRT TVs from the early part of this decade had a problem: they were called on to handle different signal frequencies, and their method of operation didn’t lend itself readily to the task. Here’s what happens when you excite some coloured phosphor, whether with electrons or with UV light. Its light output ramps up very rapidly, and once the stimulation ceases, it fades away, somewhat more slowly.

With CRT TVs, each phosphor sub-pixel (a red, green or blue dot) was hit just once by a quick spurt of electrons. The electron beam would move on, and the sub-pixel would have to wait for the next frame, one twenty-fifth of a second later (or one thirtieth in the case of US TV), before it would be refreshed. Ideally, it would have gone to a suitable output level instantaneously and maintained its brightness until the briefest of instants before the next spurt of electrons came its way, switching off instantly and fully just before that moment. What in fact happened was it would ramp up to full brightness extremely quickly, and then immediately start to fade.

The precise rate of fading could be affected by the selection (and presumably processing and treatment) of suitable phosphors. If you had a CRT TV that refreshed itself every 25th of a second, then it would need phosphors that lasted a relatively long time. But the 100Hz TVs that appeared in the late 90s and early 2000s needed to have their phosphors die out in half that time (1/50th of a second), clearing the pixel for the next frame.

Some CRT TVs ran at 100Hz for standard definition signals, but could also handle high definition, in which case they would drop back to 50Hz. I hated HD on those TVs because it would flicker. The rapid fading of the phosphors, selected for 100Hz operation, meant that the pixels would fade noticeably before each new frame.

Plasma TVs have basically avoided this problem. Until now I had assumed that it was because they would excite the phosphors in a pixel, and just keep on exciting them until shortly before the next frame was due. Now, with a bit of research, it is becoming clear what I should have realised earlier: each pixel is refreshed a number of times during the 25th or 30th or 24th of a second occupied by any given frame.

This has a number of advantages. One is that it saves power: instead of maintaining a constant ‘on’ pixel, a pulse is provided to kick it on, and then further pulses to maintain its on state are provided as needed. Second, by using a short duration phosphor, chosen according to the pixel refresh rate, you can have a rapid turn-off of the pixel at the end of the frame display period, allowing the new frame’s pixel information to replace it with the very minimum of a ‘black’ period between the two frames. These TV engineers really are very clever.

Presumably, then, a refresh rate slower than 480 hertz used to be employed. When Panasonic went to 480 hertz, that became something to boast about, and 600 hertz makes it even more boast-worthy. But the reason would seem to be not what happens during the display of any given frame, but how quickly the panel can change frames when the time comes. A quick change means more precisely defined motion.

Posted in Equipment, How Things Work, Video | Leave a comment

Super-dedicated WALL-E

I was scanning the Blu-ray version of WALL-E with BDInfo and discovered that it is organised very differently to the US one. That one apparently has the movie, with a run length of 1:37:25, in a single file (20000.M2TS). The Australian one has, incredibly, fifteen playlists for the movie. These range in run length from 1:38:11 to 1:38:41. Each playlist lists 47 files. The longest file runs for 24 minutes. Most run from less than a minute up to 2 or 3 minutes. The ‘STREAM’ folder of the disc contains 172 *.m2ts files! The largest is around 6GB.

It turns out that the playlists specify the same files, then different files, then the same files, then different, and so on. I suspected that Pixar had actually gone to the trouble of re-rendering any sections of the movie with English text on the screen into various foreign languages.

The image to the right demonstrates that this indeed is the case. I grabbed all the ‘I’ frames from three files that fill the same space in three versions of the movie. This is the 17th ‘I’ frame from, in order from top to bottom, files 00093.M2TS, 50177.M2TS and 50199.M2TS. Note the different language shown on each screen.
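
For anyone curious how I matched the playlists up, the idea amounts to this little Python sketch. The clip lists here are made-up stand-ins (only 00093, 50177 and 50199 are real file names from the disc); BDInfo reports the actual ones:

    # Clips shared by every playlist are common footage; the rest are the
    # language-specific replacements. IDs other than 00093/50177/50199 are
    # hypothetical, for illustration only.
    playlists = {
        "00800.mpls": ["00093", "00102", "00110"],
        "00801.mpls": ["50177", "00102", "00110"],
        "00802.mpls": ["50199", "00102", "00110"],
    }

    common = set.intersection(*(set(clips) for clips in playlists.values()))
    for name, clips in playlists.items():
        print(name, "language-specific:", [c for c in clips if c not in common])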

Once again, Pixar’s attention to detail is impressive. And all those replacement bits must have contributed to the disc having nearly 9GB more data than the US one.

Posted in Blu-ray, Disc details | Leave a comment

Eight bits of noise?

In the previous post I asked at the end: ‘Is 24 bits worth it?’ I want to do an extensive post on this in due course, but let me first make my point briefly. In that previous post I contrasted the 2032kbps average bitrate of the Australian (and UK, I learn) version of Slumdog Millionaire against the 3962kbps of the US version. Our version is 16 bits; their version is 24 bits.

I point out that going up from 16 to 24 bit uncompressed PCM requires a 50% increase in space, but the same increase in DTS-HD Master Audio requires 95% more space. The nice thing about Slumdog Millionaire is that the two versions likely have identical sound aside from the bit depth, making them highly comparable. But to confirm, I have eleven other Blu-ray movies with English language 5.1 channel DTS-HD Master Audio sound with 16 bits of resolution. Their average bitrates range from 1834kbps to 2504kbps, and the average of the eleven is 2114kbps. I have four Blu-ray movies with English language 5.1 DTS-HD Master Audio sound with 24 bits of resolution. They range from 3849 to 4530kbps and average 4253kbps.

That figure is twice (2.01x) the average bitrate for the 16 bit movies, so that basically confirms the drop-off in efficiency with the compression algorithm when moving from 16 bit to 24 bit sound.

To double-check, I repeated the process with the Dolby TrueHD tracks I have scanned (omitting Gandhi and Gigi, because both date from pre-5.1-channel days even though they are rendered that way on disc). I have fourteen 5.1 channel movies with TrueHD in 16 bits; these range from 1376kbps to 2172kbps, and average 1668kbps. The seven equivalents in 24 bits range from 2535 to 3525 and average 3109kbps. That’s an 86% increase in size.
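
If you want to check my arithmetic, the two headline figures come straight from those averages:

    # Average bitrates (kbps) from my BDInfo scans, as quoted above.
    dts_16, dts_24 = 2114, 4253        # DTS-HD Master Audio
    truehd_16, truehd_24 = 1668, 3109  # Dolby TrueHD

    print(f"DTS-HD MA: 24 bit is {dts_24 / dts_16:.2f}x the 16 bit average")     # 2.01x
    print(f"TrueHD: 24 bit is {(truehd_24 / truehd_16 - 1) * 100:.0f}% bigger")  # 86%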

This has led me to think that there is actually limited value in using 24 bit sound.

In brief: the efficiency of lossless compression systems like DTS-HD MA and Dolby TrueHD depends, amongst other things, on the predictability of the data. The severe drop-off in efficiency when you go from 16 to 24 bits suggests that most of those extra eight bits are unpredictable. The most likely reason for the unpredictability is that their values are largely random. Random samples are noise, pure and simple. White noise, to be precise.
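
You can see the effect with any general-purpose lossless compressor. A minimal sketch, using Python’s zlib as a stand-in for the far more specialised audio codecs:

    # Random bytes (noise) barely compress; a perfectly predictable pattern
    # collapses to almost nothing. Lossless audio codecs face the same constraint.
    import os
    import zlib

    noise = os.urandom(65536)          # unpredictable: pure noise
    pattern = bytes(range(256)) * 256  # utterly predictable ramp, same size

    for name, data in (("noise", noise), ("pattern", pattern)):
        ratio = len(zlib.compress(data, 9)) / len(data)
        print(f"{name}: {ratio:.1%} of original size")
    # noise: ~100%; pattern: well under 1%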

Looking at it from another angle, the noise floor of the microphones used to record stuff is typically -96dB or higher, and actual (rather than nominal) signal to noise of much equipment would rarely be more than 96dB. The 17th bit on a 24 bit scale is at -96dB. It isn’t surprising that the bottom eight bits of 24 bit sound should mostly be noise.
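
The decibel arithmetic behind that claim, for the record:

    # Each PCM bit is worth about 6.02dB, so the n-th bit from the top sits
    # near -6.02 x (n - 1) dB relative to full scale.
    import math

    for nth_bit in (16, 17, 24):
        print(f"bit {nth_bit}: {20 * math.log10(2 ** -(nth_bit - 1)):.1f}dB")
    # bit 16: -90.3dB; bit 17: -96.3dB; bit 24: -138.5dB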

Posted in Audio, Blu-ray, Compression | Leave a comment

24 bit sound vs 16 bit sound and DTS-HD Master Audio

I’ve run BDInfo on the forthcoming Australian release of Slumdog Millionaire on Blu-ray from Icon Film Distribution. This turns out to be a very different encode to the US version.

The main title details of the US version are here. The main title details for the Australian version, posted by me, are here. In brief, the US version gets MPEG4 AVC, we get VC1 (both have healthy bitrates in the high 20s of megabits per second). Australia gets a PIP video commentary which the US doesn’t get.

Both versions get DTS-HD Master Audio 5.1 channel sound. But the US version gets 24 bits and 48kHz, whereas we get 16 bits and 48kHz. The US version has an average bitrate of 3962kbps. The Australian version averages 2032kbps. Note: throughout this Blog post, ‘k’ equals 1,000, not 1,024.

I find those bitrates interesting. Are they comparable? What do they tell us about lossless codecs?

I’m inclined to think that aside from the 24 vs 16 bit thing, the two audio tracks are very similar, but not identical. The run length of the US release is around 30 seconds longer than the Australian release. If you look at the chapter breakdowns at those links, you will see that chapters 2 through 27 are the same length, and for chapter 28, the last chapter, the Australian version is actually about a second longer. The major timing difference is accounted for by the first chapter, and this is most likely due to different company logos at the very start of the movie.

Aside from the different sound of the logos, I’d be extremely surprised if anything was different in the source sound between the two versions. It is a very recent movie, so I expect the original multichannel PCM recording was used for both versions.

As I understand it, the only way to do DTS-HD Master Audio compression is to use equipment and software supplied by DTS, so there should be no differences in the encoding methodology. If all of this is the case, then the difference in bitrates between the two audio tracks would be overwhelmingly due to the 16 vs 24 bit question.

Now 3,962 is nearly twice 2,032 (1.95x anyway). An uncompressed LPCM 24 bit, 48kHz 5.1 channel audio track runs at 6,912kbps. The 16 bit version runs at 4,608kbps. Unsurprisingly, the former is 1.5 times the size of the latter.
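
Those uncompressed figures are just sample rate times bit depth times channel count (remembering that ‘k’ equals 1,000 here):

    # Uncompressed LPCM bitrate for 5.1 (six discrete channels) at 48kHz.
    def lpcm_kbps(sample_rate, bit_depth, channels=6):
        return sample_rate * bit_depth * channels / 1000

    print(lpcm_kbps(48000, 24))  # 6912.0 kbps
    print(lpcm_kbps(48000, 16))  # 4608.0 kbps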

So why does the efficiency of DTS-HD MA seemingly fall off so much with 24 bits? I’m not sure. But let’s look at how DTS-HD MA works.

The lossless compression I know best is Dolby TrueHD. This is very similar indeed to the Meridian Lossless Packing used on DVD Audio, which Dolby licensed from Meridian and championed for that purpose. Audio typically doesn’t compress very well using ‘traditional’ computer compression processes (e.g. WinZip), which largely rely on finding and eliminating redundancy. I have just dragged a 16 bit, 44.1kHz music file into a Zip folder and managed to reduce its size by 7%. Even our relatively inefficient compression of the 24 bit sound on this movie got it down by 43%. The 16 bit sound was reduced by 56%!

A 7% reduction is probably not worth the trouble. But 43% and 56% certainly are.
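
For anyone wanting to repeat the experiment, it amounts to this (the file name is a placeholder for whatever 16 bit, 44.1kHz file you have to hand):

    # Zip a PCM audio file and report the size reduction.
    import os
    import zipfile

    src = "music.wav"  # placeholder: any 16 bit, 44.1kHz PCM file
    with zipfile.ZipFile("music.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(src)

    saved = 1 - os.path.getsize("music.zip") / os.path.getsize(src)
    print(f"Reduced by {saved:.0%}")  # about 7% for typical music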

So how to get big compression factors? The trick used by MLP and Dolby TrueHD is to build in an algorithm that, based on the sound so far, deduces what the sound will be in the future. If a waveform is increasing, then it’s highly likely that in the next sample it will still be rising. The algorithm assumes as much, and sets its prediction of the next sample to a reasonable guess of how much the waveform will have risen, given the preceding samples. Except for some sudden transient, which the system treats as an exception to be dealt with by other means, this should give an adequate approximation of the sound.

By ‘adequate’, I do not mean adequate for listening purposes, but adequate for the next stage of the process. That stage tweaks the sample to make it accurate. Consider what happens if, instead of using, say, 16 bit samples to describe a sound wave, you use 8 bit samples: the amount of data you would be handling would be halved. Normal 16 bit sound describes a series of sample sizes. But another way of thinking about this is that it describes a series of offsets: how far each sample diverges from a particular value. With uncompressed PCM, that particular value is zero. But with MLP and TrueHD, the value is different for each sample, and is derived from that approximation algorithm.

So MLP and TrueHD basically work by using a formula to guess what the sound will be, based on what it has been, and then a series of offsets to correct it. Because the guess is typically very good, the offsets are small and you can use 8 or 6 or 4 bits to communicate these, much of the time. Exceptions are provided for, of course, but most sound is efficiently compressed. And since the encoder and decoder both use the same algorithm for the ‘guessing’ part of the process, the reconstruction of the sound (guess + offset) is conducted perfectly.
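
To make the guess-plus-offset idea concrete, here is a toy version in Python. The real MLP/TrueHD predictors are adaptive multi-tap filters and the residuals are entropy-coded; this sketch just assumes each sample continues the trend of the previous two:

    # Encode: store only the difference between each sample and a prediction
    # that linearly extrapolates the previous two samples.
    def encode(samples):
        residuals = []
        for i, s in enumerate(samples):
            pred = 2 * samples[i - 1] - samples[i - 2] if i >= 2 else 0
            residuals.append(s - pred)  # small numbers, few bits needed
        return residuals

    # Decode: rebuild each sample as prediction + stored offset. Because the
    # decoder makes the same guesses, reconstruction is bit-perfect.
    def decode(residuals):
        samples = []
        for i, r in enumerate(residuals):
            pred = 2 * samples[i - 1] - samples[i - 2] if i >= 2 else 0
            samples.append(pred + r)
        return samples

    pcm = [0, 310, 608, 882, 1120, 1313, 1452, 1531]  # a rising waveform
    residuals = encode(pcm)            # [0, 310, -12, -24, -36, -45, -54, -60]
    assert decode(residuals) == pcm    # lossless round trip

The residuals are an order of magnitude smaller than the samples themselves, and that is exactly where the compression comes from.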

That’s Dolby. How about DTS?

Both Dolby TrueHD (but not MLP) and DTS-HD Master Audio carry, on Blu-ray, a ‘core’ within themselves to cater for equipment that doesn’t support the new audio formats. Dolby TrueHD carries a Dolby Digital core (typically at 640kbps, but sometimes at 448kbps). DTS-HD Master Audio carries a DTS core. In every case I have looked at so far, the DTS core is a high bitrate core at 1,536kbps (sometimes reported as 1,509kbps). Most normal DTS tracks on DVD and many on Blu-ray use a half bitrate of 768kbps.

If your system will decode Dolby TrueHD itself, then the Dolby Digital core is totally ignored. The TrueHD component of the bitstream stands alone. DTS works a little differently. Presumably the DTS engineers thought to themselves: ‘If our audio tracks are going to be carrying a 1.5Mbps data load anyway, we might as well make use of it.’ Note, also, that DTS has always claimed that DTS is nearly lossless. That is, DTS thinks that much of the time the entire 5.1 channels of PCM can be completely and perfectly reconstructed from the 1,536kbps. I am certainly not competent to dispute this claim, although I would note that this is rather more likely to be the case with 16 bit than 24 bit sound.

So the standard DTS core forms an integral part of DTS-HD Master Audio sound. The decoder uses both the core and the rest of the bitstream to losslessly reconstruct the original sound.

Now we get to some educated guesswork on my part. It seems likely to me that DTS-HD Master Audio works in the same way as Dolby TrueHD: it uses relatively compact offsets to tweak an approximate representation of the signal into perfection. But whereas Dolby TrueHD uses a predictive algorithm, DTS uses the regular DTS core as its approximation (I suspect that regular DTS uses a much cruder predictive algorithm anyway).

Now let us consider the sizes of the tweaks involved. Both of our Slumdog Millionaire audio tracks have DTS cores of 1,536kbps. The 16 bit version requires tweaks of a modest 496kbps (2,032-1,536). The 24 bit version needs tweaks of 2,426kbps, even though the standard DTS core approximation is itself claimed to be 24 bits in resolution!
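
The subtraction, spelled out:

    # Average bitrate minus the fixed 1,536kbps DTS core leaves the lossless
    # correction ('tweak') data.
    core = 1536
    for bits, total in ((16, 2032), (24, 3962)):
        print(f"{bits} bit: {total - core}kbps of corrections")
    # 16 bit: 496kbps; 24 bit: 2426kbps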

Is 24 bits worth it? That’s a question for another day.

Posted in Audio, Blu-ray, Disc details | 5 Comments