Archives: May 2007
17/05 Can you see it?

If you've ever looked into getting an HD television, you've probably come across the fact that "HD" can refer to pictures that are either 720 or 1080 pixels high. Most (perhaps all) HD services at the time of writing broadcast in either "720p" (images that are 720 pixels high, and scanned progressively) or "1080i" (images that are 1080 pixels high, and scanned interlaced). It so happens that video in these two display formats requires (in very rough terms) about the same bitrate for approximately the same quality.
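You can see why the bitrates come out roughly equal with some back-of-the-envelope arithmetic on raw pixel rates. This is just a sketch, assuming European 50Hz frame/field rates (US services use 60Hz, but the comparison works out the same way):

```python
# Rough pixel-rate comparison of 720p and 1080i.
# Assumes 50 Hz European rates - 50 progressive frames/s for 720p,
# 50 interlaced fields/s for 1080i.

def pixel_rate(width, height, rate, interlaced=False):
    """Pixels transmitted per second for a given format."""
    # An interlaced format sends only half the lines in each field.
    lines_per_pass = height // 2 if interlaced else height
    return width * lines_per_pass * rate

rate_720p = pixel_rate(1280, 720, 50)           # 720p50
rate_1080i = pixel_rate(1920, 1080, 50, True)   # 1080i50

print(f"720p50:  {rate_720p / 1e6:.2f} Mpixels/s")   # 46.08 Mpixels/s
print(f"1080i50: {rate_1080i / 1e6:.2f} Mpixels/s")  # 51.84 Mpixels/s
```

The raw pixel rates are within about 12% of each other, which is why the two formats end up needing broadly similar bitrates after compression.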
I won't talk much about interlace here; suffice to say that it is a nasty and anachronistic way to reduce bitrate at the cost of video quality. In other words, it's a compression scheme, albeit one invented seventy years ago. In the intervening period we've invented far better methods of compressing video.
One argument that I occasionally hear is that "there's no point getting a display capable of 1080-line HD, because unless you have a massive screen, you won't be able to see the detail". My colleagues did a thorough investigation of the subject a few years ago, and determined that yes, people were incapable of seeing the extra detail in images at resolutions above 1280x720 unless they had a screen bigger than 50 inches diagonal (at a typical viewing distance of 2.7m). But they also pointed out that unless you want to be able to see the individual pixels (and who does?), your television should have a higher resolution than the images it's displaying. So: if you're buying a television, get one with a native resolution of 1920x1080. And if you're planning to set up as an HD broadcaster? If picture quality is your priority, 720p will be perfectly adequate for today's viewers, and gets rid of all those nasty interlacing artefacts.
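The 50-inch figure is roughly what you'd expect from first principles. Here's a sketch of the arithmetic, assuming the common rule of thumb that 20/20 vision resolves detail down to about one arcminute (my assumption, not a figure from the study itself):

```python
import math

# Angle subtended by one pixel row of a 16:9 screen, in arcminutes.
# Rule of thumb (assumed here): detail finer than ~1 arcminute is
# invisible to a viewer with 20/20 vision.

def pixel_angle_arcmin(diagonal_inches, lines, distance_m, aspect=16/9):
    # Screen height from the diagonal and aspect ratio.
    height_m = diagonal_inches * 0.0254 / math.hypot(aspect, 1)
    pixel_m = height_m / lines
    return math.degrees(math.atan2(pixel_m, distance_m)) * 60

# A 50-inch, 720-line screen viewed from 2.7 m:
angle = pixel_angle_arcmin(50, 720, 2.7)
print(f"{angle:.2f} arcmin per pixel")  # about 1.1 arcmin
```

At 2.7m, a 720-line pixel on a 50-inch screen subtends just over one arcminute, so 50 inches is about where the extra lines of 1080 start to become visible, which matches what my colleagues found.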