Monday, May 02, 2005

The Road to 720p

From Tom Norton's report on Home Entertainment 2005:
Since most TVs and projectors today operate at a native rate of 720p (slightly different in the case of some flat panels), a 1080i high-definition source must be converted to 720p for display. Silicon Optix claimed that most TV manufacturers do this by using just one 540-line field from each 1080i frame, scaling it up to 720p, because this requires much less processing horsepower than deinterlacing 1080i to 1080p and then scaling to 720p. I reported this technique first in my review of the InFocus ScreenPlay 777. (So far I have not seen it mentioned in any other publication.) This essentially limit's[sic] the resolution of such a source to 540p, which, technically, is not high definition.
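
In other words, the shortcut saves silicon by throwing away vertical resolution. A quick back-of-the-envelope comparison of the two paths (toy Python, my own numbers, nobody's actual firmware):

```python
field_lines  = 540    # one field of a 1080i frame -- the "cheap" path's input
frame_lines  = 1080   # both fields, after a proper deinterlace to 1080p
target_lines = 720    # the native panel resolution

# Cheap path: 720 output lines interpolated from only 540 distinct source
# lines, so vertical detail tops out at 540 -- below the HD threshold.
print(field_lines / target_lines)   # 0.75: upscaling, no detail to spare

# Full path: deinterlace to 1080p first, then scale down; every output line
# draws on 1080 source lines, so detail survives all the way up to 720.
print(frame_lines / target_lines)   # 1.5: downscaling, detail to spare
```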

In case you're wondering, the ScreenPlay 777 is a $20,000 front projector that uses 3-chip DLP, widely regarded as the best consumer HDTV technology available today. Here's what Mr. Norton's review had to say about its scaling methods:
For conversion to the native 720p resolution of the projector, 1080i material undergoes an interesting process. Consider each 1080i frame as two 540p fields. Each of these 540p fields is converted to the 720p native resolution of the 777. These upscaled fields are then displayed sequentially. One could, I suppose, make the argument that this potentially reduces or eliminates the temporal (motion) problems inherent in 1080i material. One could also make the argument that it eliminates any spatial (static) resolution advantages that 1080i has over 720p—or indeed over even native 720p sources. But since both 720p and 1080i material looked superb on the 777 (more on this a bit further on), I haven't yet considered the implications of the process too deeply.
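
To make that process a little more concrete, here's roughly what it looks like faked in Python, with NumPy arrays standing in for video frames (the function names and the crude linear scaler are mine, not InFocus's):

```python
import numpy as np

def scale_lines(field, target):
    """Naive vertical resize by linear interpolation (illustration only)."""
    pos = np.linspace(0, field.shape[0] - 1, target)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, field.shape[0] - 1)
    w = (pos - lo)[:, None]
    return (1 - w) * field[lo] + w * field[hi]

def bob_to_720p(frames_1080i):
    """Treat each 1080i frame as two 540-line fields, scale each one to
    720 lines, and hand them to the panel one after the other."""
    for frame in frames_1080i:
        for field in (frame[0::2], frame[1::2]):    # top field, then bottom
            yield scale_lines(field, 720)

clip = [np.random.rand(1080, 1920) for _ in range(30)]   # ~1 second of 1080i
pictures = list(bob_to_720p(clip))
print(len(pictures), pictures[0].shape)   # 60 (720, 1920)
```

Thirty interlaced frames go in, sixty progressive pictures come out (the field rate), but no single output picture ever contains more than 540 lines of source detail.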

But wait, there's more! Prompted by this controversy, a slashdotter expounds upon spatial vs temporal resolution:
1080i is 1920x1080 @ 59.94 fields / second, meaning at any one instant in time, you're looking at a 1920x540 image made up of every other line of the picture (the odd fields, if you will.) Then, ~1/60th of a second later, you see the even fields. 720p is 1280x720 @ 60 FRAMES per second, meaning at any given instant you're looking at EVERY field of the image...not just the odd or even fields. If you were to try and take all 1080 lines from the original signal, they wouldn't really map properly to 720 at any given second because half of data would be from that same ~1/60th of a second later. Scaling the fields up is really the best way to go, at least for stuff that's been shot interlaced.
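
Boiled down to arithmetic (with both rates rounded to a flat 60), the slashdotter's point looks like this:

```python
# Pixels you actually see at any one instant:
i1080_field = 1920 * 540      # one field of 1080i
p720_frame  = 1280 * 720      # one full frame of 720p
print(i1080_field, p720_frame)            # 1036800 vs 921600 -- nearly a wash

# Over a full second the totals stay close, too; the real difference is *when*
# each line was captured, which is why weaving two fields of moving,
# interlaced-shot footage gives you comb artifacts rather than extra detail.
print(i1080_field * 60, p720_frame * 60)  # 62208000 vs 55296000 pixels/sec
```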

Is your brain hurting yet?

The next big thing in HDTV is 1080p, the highest resolution supported by the current standards. But even though several manufacturers are working on affordable 1080p displays, I've yet to hear any broadcasters talking about 1080p. Which means that when you get that astronomically overpriced 1080p HDTV this Christmas, you'll still be watching video that's been de-interlaced-- even if a program was shot in 1080p, it will have been broadcast in 1080i, so half the data was thrown away before transmission, and your TV will be interpolating to re-create it for display.
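
Here's a toy Python version of what that means for a single instant of the picture (a crude neighbor-averaging interpolation stands in for whatever deinterlacer your set actually uses):

```python
import numpy as np

original  = np.random.rand(1080, 1920)   # one frame as it was shot (1080p)
top_field = original[0::2]               # the 540 lines that actually air
                                         # at this instant of 1080i

# The receiver's guess for the 540 missing lines: average the lines above
# and below (wraps at the bottom edge -- good enough for a sketch).
guessed = (top_field + np.roll(top_field, -1, axis=0)) / 2

rebuilt = np.empty_like(original)
rebuilt[0::2] = top_field                # half the lines are the real thing
rebuilt[1::2] = guessed                  # half the lines are interpolation
print(np.mean(np.abs(rebuilt - original)))   # nonzero: detail that never arrived
```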

Of course, unless you're a hard-core videophile, you probably won't care. Sure, most people can tell the difference between SD and HD, and at least subconsciously sense that non-interlaced displays are better (my wife, who has a penchant for coining her own terminology, calls them "crispier" than interlaced), but they just don't think it's that much better-- certainly not enough to justify spending over a thousand dollars on one TV set.

When consumers are willing to tune in to static-laden UHF stations and spend most of their 7+ hours of TV watching per day on reality shows, and most HDTVs are big-screen monsters marketed as luxury items instead of appliances, do you really expect the FCC to be able to flip the switch to HD just 20 months from now? Answer: No.
