Demystifying High Definition Television
Written by John Groves
1080i, 720p, 1080p and now 4K: but what does it all mean? Will I be able to tell the difference, and what TV should I consider buying to get the best possible picture?
Well this is one area where size really does matter. The size of your screen that is!
Let me start by explaining the difference between the different formats.
Every TV image is made up of pixels, and naturally the more pixels that make up the image, the better its quality (or resolution). The larger the screen, the more pixels are needed to keep the picture looking sharp. Liken it to using a magnifying glass on a printed newspaper: you can see all the dots that make up the image. So on any HDTV under 32 inches, the difference between 720 and 1080 is unlikely to be noticeable. On a 48-inch screen, however, pixels really do count. And not just pixels: the number of frames shown per second also becomes an important consideration.
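For the technically minded, the screen-size argument comes down to pixel density. A quick sketch in Python (my own illustration, not part of any broadcast standard) shows how many pixels per inch each format gives you at different screen sizes:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density: pixels along the diagonal divided by the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# On a 32-inch screen, 1080 lines only modestly out-resolve 720
print(round(pixels_per_inch(1280, 720, 32)))   # ~46 ppi for 720p
print(round(pixels_per_inch(1920, 1080, 32)))  # ~69 ppi for 1080p

# On a 48-inch screen, 720p drops to roughly 31 ppi - coarse enough
# to notice, which is why pixels start to count at this size
print(round(pixels_per_inch(1280, 720, 48)))   # ~31 ppi
```

Whether those differences are visible also depends on viewing distance and eyesight, as discussed below.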
UK broadcast TV images are shown at 25fps (frames per second). Each frame is transmitted to your TV as two alternating fields, each containing half the picture lines, which are then combined by a process called interlacing (the "i"). So a 1080i HDTV picture comprises 1920 × 1080 pixels, with the two fields interlaced to make up a single frame; 25 of these frames are displayed every second to create the perception of movement. Progressive ("p") TV pictures, on the other hand, have no interlacing: each complete frame is displayed in one pass at 25fps, so in effect you see twice the resolution of an interlaced picture at any given moment. This improvement means an image comparable to 1080i can be produced using only 1280 × 720 pixels, known as 720p.
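To make the field/frame relationship concrete, here is a toy Python sketch (purely illustrative, nothing like a real broadcast chain) that splits a frame's scan lines into two fields and weaves them back together, as an interlaced TV does:

```python
def split_into_fields(frame):
    """Split a frame (a list of scan lines) into its two alternating fields."""
    return frame[0::2], frame[1::2]   # even-numbered lines, odd-numbered lines

def weave(top_field, bottom_field):
    """Recombine the two fields into one complete frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.extend([top_line, bottom_line])
    return frame

frame = [f"line {n}" for n in range(1080)]   # a full 1080-line frame
top, bottom = split_into_fields(frame)
assert len(top) == len(bottom) == 540        # each field carries half the lines
assert weave(top, bottom) == frame           # interlaced back into a full frame
```

In real interlaced video the two fields are captured a fraction of a second apart, which is why fast motion can show "combing" artefacts that progressive pictures avoid.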
A TV set that can display 1080p has the best of both the above, showing more pixels than 720p (1920 × 1080) and, in effect, twice the frame rate of 1080i. In these sets, 720p images are automatically up-scaled (or "up-rezzed") to 1080p with minimal loss of definition.
When it comes to discernible picture quality you should also take into account how far you sit from the screen, the ambient viewing conditions (i.e. lighting in the room) and how good your own eyesight is. Generally, if you sit more than 10 feet away from your TV and your display isn't bigger than 50 inches diagonally, you won't be able to tell the difference between 720 and 1080.
[Image: Comparative resolutions between standard definition and ultra-high definition]
But what is 4K? This is the next generation of high definition television, titled Ultra-High Definition (UHD). It displays images at a resolution of 3840 × 2160 pixels, or around 8.3 megapixels. That is twice the resolution of full HD in each direction (four times the total number of pixels), allowing much larger screens to be manufactured. Sony, Samsung and LG already have UHD TV sets for sale, although the price tag is also ultra-high at between £4,000 and £6,000.
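The arithmetic behind those figures is easy to check. A few lines of Python (again, just my own back-of-envelope working) confirm the pixel counts and the four-fold jump from HD to UHD:

```python
hd = 1920 * 1080    # full HD: 2,073,600 pixels (about 2.1 megapixels)
uhd = 3840 * 2160   # UHD/4K:  8,294,400 pixels (about 8.3 megapixels)

# Doubling both width and height quadruples the pixel count
print(uhd / hd)     # 4.0
```

It is this quadrupling of data per frame that drives the bandwidth problem discussed next.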
In reality it will be some time before broadcast TV transmits in 4K, as this means a substantial overhaul of the digital transmission system to handle the considerable extra bandwidth required. Work began recently on a suitable codec offering high compression and low(ish) loss to handle the extra data needed to transmit such pictures; however, this will still require substantial investment in upgrading the whole transmission pathway. Meanwhile, the next generation of high-capacity Blu-ray players is being developed to enable UHD films to be released for the home cinema market.
So what comes after 4K? Sharp are now working on production of an 8K "Super Hi-Vision" screen showing 7680 × 4320 pixels (33.2 megapixels). A prototype was demonstrated at a recent trade show, and the Japanese broadcaster NHK are planning to use 8K TV at the 2020 Tokyo Olympic Games. But the big question is: where in your home will you place your 85-inch TV?