Don't get confused: an HDMI port alone does not make a TV high definition
Published by Brian Butterworth on UK Free TV
We've come across a class of devices that seems to confuse HDTV buyers. These are called "upscalers" and they seem to miraculously produce high definition pictures.
HDTV usually requires a high definition input port, either an HDMI or a DVI connection. These ports allow the higher definition modes, 1280×720 and 1920×1080, to be displayed.
Your comments: most recent posts are at the bottom
Charlie Friday 25 May 2007 5:10PM
That's not quite accurate. It's true that simply stretching an SD signal to HD makes no difference whether it's done in the device or in the HDTV.
What matters is the actual hardware that does the scaling. A decent upscaler will run different algorithms on the image. The more complex the upscaling process, the more processing power is required, making the device more expensive.
Some HDTVs, particularly cheap ones, have very poor quality upscalers built in.
It doesn't really matter whether it's the TV or some other device that does the upscaling. It's just that upscaling the signal before it reaches the TV lets you bypass the TV's built-in upscaler.
I am sure that for most people their HDTV does a good enough job. Of course, as a manufacturer, claiming your device upscales lets you charge more for it. Some are worth it and some aren't.
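As a rough illustration of Charlie's point (a minimal Python sketch using the Pillow library; the input file name is hypothetical): two resampling algorithms of very different cost produce visibly different results from the same SD frame, yet neither adds any real detail.

```python
from PIL import Image  # Pillow

HD_SIZE = (1280, 720)

frame = Image.open("sd_frame.png")  # hypothetical 720x576 SD frame

# Cheap upscale: nearest-neighbour just repeats pixels (blocky edges).
cheap = frame.resize(HD_SIZE, Image.NEAREST)

# Costlier upscale: Lanczos windowed-sinc filtering needs far more
# processing per pixel, which is what a "decent upscaler" is paying for.
better = frame.resize(HD_SIZE, Image.LANCZOS)

cheap.save("upscaled_nearest.png")
better.save("upscaled_lanczos.png")
```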
Charlie: I'm sorry, but there are NO "algorithms" that will increase the amount of information in the original signal. I've NEVER seen an HDTV that has a bad upscaler built in; it's a blitter, an age-old computing concept!
While I can't say Charlie is correct, the evidence of my own eyes on a Sony Bravia 32in TV shows that upscaled Freeview via HDMI from a BT Vision box looks a damn sight better than the SCART output of the same box. The same goes for a Sony upscaling DVD player. And it's not just because the HDMI is digital: 576p on HDMI doesn't look as good.
Upscaling definitely makes a difference but, as you say, it doesn't make things HD, as there is no new detail. It could be that HD-ready TVs apply less processing (sharpness etc.) to a signal coming in at a resolution near the screen's native resolution (720p in this case), which makes it look better.
Brian, it is true that scaling is easy with today's technology and most HD-ready TVs do it perfectly well. Does this mean that the processor boxes offered by DVDO etc. are worthless? Absolutely not.
Before an SD picture can be scaled, it must be converted from interlaced to sequential (progressive) scanning. It is here that the cheap processors built into TVs tend to fall over. A critical test is fine detail in the picture that moves slightly, e.g. someone wearing a patterned tie, or a handheld camera shot looking at fine detail. The detail will flicker and shimmer as the system tries to decide whether to treat it as moving or stationary. My DVDO scaler isn't perfect, but it's MUCH better than my Sony Bravia working alone.
Good deinterlacing needs several fields feeding the filter simultaneously so that movement can be detected reliably. The DVDO needs, and includes, an audio delay to compensate for the time the video processing takes. The Sony needs no such delay. You can draw your own conclusions.
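To make that concrete, here is a hedged sketch of textbook motion-adaptive deinterlacing in Python with NumPy (not the DVDO's actual algorithm): a missing line is "woven" from the neighbouring fields where the picture is static and interpolated ("bobbed") where motion is detected, which is exactly why several fields must be buffered at once.

```python
import numpy as np

def deinterlace(prev_odd, cur_even, next_odd, threshold=10.0):
    """Toy motion-adaptive deinterlacer.

    cur_even:  the even lines received in this field, shape (H//2, W).
    prev_odd / next_odd: the odd lines from the fields either side.
    Where the two odd fields agree (static picture) we "weave" them in;
    where they differ (motion) we "bob" by averaging the even lines
    above and below, trading resolution for freedom from combing.
    """
    h2, w = cur_even.shape
    frame = np.zeros((h2 * 2, w), dtype=np.float32)
    frame[0::2] = cur_even  # the lines we actually received go straight in

    # Per-pixel motion detector: a large change between the two
    # surrounding odd fields means weaving would leave comb artefacts.
    motion = np.abs(prev_odd.astype(np.float32)
                    - next_odd.astype(np.float32)) > threshold

    line_below = np.roll(cur_even, -1, axis=0).astype(np.float32)  # wraps at the edge (toy code)
    bob = (cur_even.astype(np.float32) + line_below) / 2.0
    weave = (prev_odd.astype(np.float32) + next_odd.astype(np.float32)) / 2.0

    frame[1::2] = np.where(motion, bob, weave)
    return frame

# Example with three random 288-line fields from a 576i source:
rng = np.random.default_rng(0)
fields = [rng.integers(0, 256, (288, 720)).astype(np.float32) for _ in range(3)]
progressive = deinterlace(*fields)  # -> a (576, 720) progressive frame
```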
As for resolution, a conventional 50Hz interlaced display will appear to have somewhat reduced vertical resolution. This is due to the combined effect of large area flicker, interlace flicker and visible line structure. This was established by subjective tests a long time ago. Removing these effects by deinterlacing and scaling will result in some subjective improvement in resolution, albeit only in the vertical direction.
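As a back-of-envelope illustration of that subjective loss (the ~0.7 figure is an assumed combined Kell/interlace factor, not a number from David's post):

```python
# Perceived vertical resolution of a 576-line interlaced display,
# using an assumed combined Kell/interlace factor of ~0.7 (classic
# subjective-test figures vary roughly between 0.6 and 0.8).
ACTIVE_LINES = 576
KELL_INTERLACE_FACTOR = 0.7  # assumption, not a measured value

print(f"~{ACTIVE_LINES * KELL_INTERLACE_FACTOR:.0f} perceived lines")
# -> ~403 lines, so deinterlacing to a progressive display can win back
#    some subjective resolution, in the vertical direction only.
```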
Chris, I can confirm your findings about the Bravia. It seems to apply various amounts of peaking to the video response depending on which input standard it's receiving. Apart from the obvious distortion of edges, this also emphasises the MPEG compression artifacts with off-air digital reception. Only the 720p mode seems to work cleanly and transparently.
Ian: To be honest, I use Windows Vista with Media Center (Ultimate, but it's in the regular edition too) with two DTT cards and a 500GB USB 2 drive, so I've got a perfect, DVI-connected picture with a decent multi-GPU video card, and it looks amazing, even with Freeview. I can download HD stuff over the net if I want to use it properly. I've even got the Media Center remote that glows! If you want it to look good, think DVI from a Media Center PC.
'HDMI upscaling' is a complete scam: my Toshiba player offers three resolution levels with absolutely zero discernible difference between them, whatever the source material, even with the output going from its HDMI socket straight into the TV's HDMI socket (oh, and it won't play the new HD discs either).
I'd like to add weight to David Robinson's comments: to display 720×576i on a 1024p panel, the video must be upscaled and re-sampled. This does not, as Briantist correctly points out, increase the amount of information in the signal; it simply avoids the ugly aliasing artifacts that would be present without this process.
However, the optimum point to do this is while the signal is still in the MPEG domain, and a good upscaler may use the motion vectors to help the interlaced-to-progressive conversion, as an intermediate field must be generated. If it is done after an intermediate analogue stage, the results can be quite disappointing; this is why upscaling in the DVD player or STB with an HDMI connection works so well. I've seen an ASDA 'ONN' LCD fed from a Panasonic upscaling DVD player look superb, and I've seen £1000+ LCD and plasma sets look c**p because the inbuilt upscaler has to make the best of a SCART feed.
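A hedged sketch of that idea in Python (the helper and field layout are hypothetical, not any real decoder's code): with the MPEG motion vector for a block still to hand, the missing intermediate field can borrow a motion-shifted block from the previous field rather than blindly averaging lines.

```python
import numpy as np

def predict_missing_block(prev_field, mv_x, mv_y, x, y, size=8):
    """Fetch a block from the previous field, displaced by the MPEG
    motion vector (mv_x, mv_y), as the estimate for the same block in
    the intermediate field being generated. Working here, still in the
    MPEG domain, avoids the losses of an analogue round trip."""
    return prev_field[y + mv_y : y + mv_y + size,
                      x + mv_x : x + mv_x + size]

# Toy usage: one 288-line field from a 576i signal.
prev_field = np.zeros((288, 360), dtype=np.uint8)
block = predict_missing_block(prev_field, mv_x=2, mv_y=-1, x=40, y=100)
assert block.shape == (8, 8)
```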
The high-end Sony 100Hz CRT tellies had an 'i' to 'p' upscaler. They had some strange artifacts at times, but they were very comfortable to watch because of the high refresh rate.
The Genesis Microchip/Faroudja video processing chip is reckoned to be the best upscaler.
Even Blu-ray discs of old films that sell for 25 quid are just upscaled versions of the original film, marketed as HD (it was never filmed in HD), but they still look better than DVD.
With a decent telly and a decent source you can have a mint picture; whether it's classed as SD or HD, who actually cares?
Don't forget to use decent cables, cos standard cables are absolute crap. A decent source sent to a decent telly with crap cables is totally pointless.
I have Sky TV going via a QED SCART cable to a Sony RDR-HX860, which upscales to 1080i, then to a Toshiba 32WLT66 over an IXOS HDMI cable, and every channel has a mint picture because of the upscaling process.
My dad has Sky HD and the HD channels are mint, but every other channel is crap because it doesn't upscale.
Andrew: Movie film has always been higher definition than TV. If the original print of the film was used to create the HD version, then it will be HD. This is quite normal, and the telecine process will produce a much better version than on DVD. TV programmes were sometimes shot on film, but the film was not usually preserved, as it would have been copied to videotape for editing. So old films on Blu-ray, at least, can be HD.
Andrew: Also, there is no point whatsoever in paying for expensive cables that carry DIGITAL signals. Either they carry the data, or they do not. I also think you will find that "every other channel is crap" because it is not in HD, not because it is not "upscaled".
Briantist: I already knew that movies were much better quality than TV/DVD, but they were never 1920×1080p (HD) and have been upscaled from their native resolution.
I also totally agree that expensive cables are a waste of money, but a £30 HDMI cable will provide cleaner, sharper results than a £5 one.
I knew the other channels were not in HD and not upscaled, but all of my Sky SD channels are upscaled by my Sony RDR-HXD860 and are on a par with Sky HD channels (only compared with me dad's), which proves (to me, that is) that upscalers work, providing much better results at a fraction of the cost of going full HD. I think when analogue is switched off there will be a lot more bandwidth for full 1080p broadcasting, which is when I will upgrade; until then I'm well happy.
Look forward to more of your thoughts, cos I reckon you know a thing or two.
Andrew: Even 35mm film resolution is way, way in excess of 1920×1080, and most movies have been shot and printed on this film since 1892! The article "How many pixels are there in a frame of 35mm film?" puts it at 5300×4000, and digital cinemas work at 4096×2160. I'm sorry, but you are 100% wrong about the HDMI cable. The information is DIGITAL, just 0s and 1s. Either they will be transferred 100% correctly or they will not. The idea that the picture could be better or worse is an analogue idea.
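Putting Brian's figures side by side (a quick sanity check of the numbers quoted above, nothing more):

```python
# Pixel counts of the resolutions quoted in this thread.
def megapixels(width, height):
    return width * height / 1e6

print(f"35mm scan, 5300x4000:      {megapixels(5300, 4000):.1f} MP")  # ~21.2
print(f"Digital cinema, 4096x2160:  {megapixels(4096, 2160):.1f} MP")  # ~8.8
print(f"HD, 1920x1080:              {megapixels(1920, 1080):.1f} MP")  # ~2.1
```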
Briantist: You even said it yourself: either they will be transferred 100% correctly or not. Better quality cables will transfer more of the signal at 100%. Didn't realise 35mm was so high quality; I stand corrected.
Andrew: Either it's 100% of the data, or it isn't. Anything less is a useless cable. The "signal" does not matter, as long as it carries 100% of the data. They are 0s and 1s; you can't have "a stronger one" or a "stronger zero".
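A toy demonstration of Brian's point (illustrative only; it assumes the noise a cable adds stays within the receiver's decision margin): the receiver only asks "above or below threshold?", so there is no gradual quality scale for the bits themselves.

```python
import random

def send_over_noisy_cable(bits, noise_amplitude):
    """Send bits as 0.0/1.0 voltage levels with random analogue noise,
    then recover them with the simple 0.5-volt threshold a digital
    receiver effectively applies."""
    recovered = []
    for bit in bits:
        voltage = float(bit) + random.uniform(-noise_amplitude, noise_amplitude)
        recovered.append(1 if voltage > 0.5 else 0)
    return recovered

bits = [random.randint(0, 1) for _ in range(10_000)]

# A "worse" cable adds more noise, but while it stays inside the margin
# the output is bit-for-bit identical: no stronger ones or zeros.
assert send_over_noisy_cable(bits, noise_amplitude=0.4) == bits
print("every bit recovered exactly, despite the analogue noise")
```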