While high definition has become a reality for many consumers, the technical jargon associated with this exciting new technology is causing much confusion. Just as we were beginning to understand the differences between Blu-ray and HD DVD, along comes a new high-definition format: 1080p.
But why do we need another high-definition format anyway? Many of us have bought our HD Ready screens and were ready to sit back and enjoy this new viewing experience, but now we are left wondering whether we bought the right kit in the first place.
Many of the more recent HD Ready flat screens feature a resolution of 1,366x768 pixels. This will display the commonly used 720p and 1080i formats, although 1080i/1080p signals will be downscaled to fit. To display a 1080i/1080p signal in its entirety, you'll need a screen with a resolution of 1,920x1,080 pixels, coined 'Full HD' by the marketing men.
However, just because a screen has a 1,920x1,080-pixel panel, it does not necessarily mean that it will accept a 1080p input - so check before you buy.
Remember, 720p, 1080i and 1080p are formats in which 'sources' of high-definition content are presented for viewing on an output device such as your LCD or plasma screen. The source could be your TV cable provider, for example, or your Xbox 360. To restate the point: 1080i/1080p needs a screen resolution of 1,920x1,080 pixels to be displayed in its entirety, but you don't have to have a screen with this resolution to display a 1080i/1080p signal - lower-resolution screens downscale the signal to fit.
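The fit-or-downscale logic above can be sketched in a few lines of Python. This is purely illustrative - the function name and behaviour are our own invention, not any real TV or driver API:

```python
# Illustrative sketch: a lower-resolution panel does not reject a
# 1080i/1080p signal, it simply scales the picture down to fit.
def fits_natively(panel, signal):
    """Return True if the panel can show every pixel of the signal."""
    return panel[0] >= signal[0] and panel[1] >= signal[1]

HD_READY = (1366, 768)    # typical 'HD Ready' flat screen
FULL_HD = (1920, 1080)    # 'Full HD' panel
SIGNAL_1080 = (1920, 1080)

print(fits_natively(HD_READY, SIGNAL_1080))  # False -> signal is downscaled
print(fits_natively(FULL_HD, SIGNAL_1080))   # True  -> shown in its entirety
```

Either way you get a picture; only the Full HD panel shows every pixel of a 1080-line source.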
720p and 1080i were initially set out as the two key standards for high-definition content, with Sky HD, HD DVD and the Xbox 360 supporting these formats. Any TV that supports 720p and 1080i is classed as HD Ready. Let's take a step back for a moment and look at the development of TV technology to see how we arrived at these standards.
In a CRT display (the TV you grew up with), a stream of electrons is generated by an electron gun and scanned across the face of the tube in scan lines, left to right and top to bottom. The face is coated in phosphors, which glow when hit by the electron stream. A method of scanning was required that would reduce the transmitted TV picture's bandwidth and work in accordance with the electricity supply frequency (50Hz in the UK and Europe, 60Hz in the US). The result was interlaced scanning.
A method of reducing bandwidth was required because early sets were not able to draw the whole picture on screen before the top of the picture began to fade, resulting in a picture of uneven brightness and intensity. To overcome this, the screen was split in half, with only half the lines (each alternate line) being refreshed each cycle. Hence, the signal is interlaced to deliver a full screen refresh every second cycle. So if the interlaced signal refreshes half the lines on a screen 50 times per second, this results in a full screen (or frame) refresh rate of 25 times per second. The problem with interlacing is the distortion that appears when an image moves quickly between the odd and even lines, because only one set of lines is ever being refreshed at any moment.
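The field-to-frame arithmetic above is simple enough to sketch. The constants below are just the mains frequencies mentioned in the text:

```python
# An interlaced signal refreshes half the lines (one field) per cycle,
# so the full-frame rate is half the field rate.
FIELD_RATE_UK = 50  # fields per second (UK/Europe mains frequency)
FIELD_RATE_US = 60  # fields per second (US mains frequency)

def frame_rate(field_rate):
    # Two fields (odd lines + even lines) make up one full frame.
    return field_rate / 2

print(frame_rate(FIELD_RATE_UK))  # 25.0 full frames per second
print(frame_rate(FIELD_RATE_US))  # 30.0 full frames per second
```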
As TV screen technologies have progressed another system called Progressive Scan has also been developed. With progressive scanning the frames are not split into two fields of odd and even lines. Instead, all of the image scan lines are drawn in one go from top to bottom. This method is sometimes referred to as 'sequential scanning' or 'non-interlaced'. The fact that frames are shown as a whole makes it similar in principle to the way film is shown at the cinema.
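The difference in line order between the two schemes can be shown with a toy sketch (the function names and the six-line 'screen' are illustrative only):

```python
# Toy model of the two scanning schemes: interlaced draws all the even-
# numbered lines (first field), then all the odd-numbered lines (second
# field); progressive draws every line top to bottom in one pass.
def interlaced_order(lines):
    first_field = list(range(0, lines, 2))
    second_field = list(range(1, lines, 2))
    return first_field + second_field

def progressive_order(lines):
    return list(range(lines))

print(interlaced_order(6))   # [0, 2, 4, 1, 3, 5]
print(progressive_order(6))  # [0, 1, 2, 3, 4, 5]
```

With interlacing, lines 1, 3 and 5 are drawn a full cycle later than lines 0, 2 and 4 - which is exactly why fast motion can look distorted.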
At this point it is worth considering what we mean by resolution in relation to TVs:
Resolution: HD-Ready TVs need to be able to display pictures at the resolution set by the new standard. Resolution can be described either in terms of "lines of resolution," or pixels. The resolution you see on your TV depends on two factors, namely the resolution of your display and the resolution of the video signal you receive. Because video images are always rectangular in shape, there is both horizontal resolution and vertical resolution to consider.
Vertical resolution: This is the number of horizontal lines that can be resolved in an image from top to bottom. The old familiar CRT TV displays 576 lines, while Digital HD television operates at a resolution of either 720 or 1080 lines. This is the most important resolution as it is most noticeable to the human eye.
Horizontal resolution: This is the number of vertical lines that can be resolved from one side of an image to the other. Horizontal resolution varies depending on the source. The number of horizontal pixels is not quite as critical as the vertical resolution, as it is not as obvious to the human eye during normal viewing.
An analogue TV signal in Europe, where the PAL standard is used, has 625 horizontal lines of which 576 lines are displayed and the image (or frame) is refreshed 25 times a second. This is the standard we have been used to for years.
A high-definition digital TV signal delivers significantly more picture detail and higher audio quality than a standard signal, producing pictures that are noticeably better, sharper and clearer:
720p: 1,280x720 pixel resolution. A high-definition picture that is displayed progressively. All the lines of each frame are drawn in a single top-to-bottom pass, so motion appears smoother than with an interlaced picture.
1080i: 1,920x1,080 pixel resolution. A high-definition picture that is displayed interlaced. Each odd line of the picture is displayed, followed by each even line, so the resulting image is not as smooth as a progressive feed. Because it carries more lines than 720p, 1080i gives a more detailed picture suited to documentaries and wildlife footage, but the interlacing makes it less suitable for action-oriented material such as sports and movies.
1080p: 1,920x1,080 pixel resolution. A high-definition picture that is displayed progressively. All the lines of each frame are drawn in a single pass, so motion appears smoother than with an interlaced picture. This is the ultimate high-definition standard -- the most detailed picture, displayed progressively.
There are two main broadcast formats for HDTV, namely 720p (a 720-line picture progressively scanned 50 times a second) and 1080i (1,080 lines interlaced at 50 fields per second). In terms of total pixels, a high-definition picture carries between roughly two and five times as much detail as a typical 576-line TV picture, depending on the format.
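The pixel counts behind that comparison are easy to check, assuming a 720x576 frame for PAL standard definition (the exact SD pixel width varies by source):

```python
# Total pixel counts per frame, relative to a 720x576 PAL SD frame.
formats = {
    "576 (PAL SD)": (720, 576),
    "720p": (1280, 720),
    "1080i/1080p": (1920, 1080),
}

sd_pixels = 720 * 576  # 414,720 pixels

for name, (width, height) in formats.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / sd_pixels:.1f}x SD)")
```

A 1080-line frame works out at 2,073,600 pixels, five times the PAL SD count, while 720p comes in at a little over double.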
The point here is that most high-definition broadcasting is in either 720p or 1080i, so not having a screen that can display 1080p may not matter to you. However, there are exceptions, and if you are a serious game player you will probably already know one of them - or, to be precise, two of them. The Xbox 360 (with a little tweaking) and the soon-to-arrive PlayStation 3 produce output at 1080p. The new high-definition disc format Blu-ray has also been designed for 1080p output. Is the difference worth the extra investment? Maybe - that is something you will have to judge for yourself.