Why can some HD TVs display 1080i but not 1080p?


Jack Kelly
December 2nd, 2008, 04:38 AM
A friend asked me this question last night. I couldn't answer and it's really bugging me!

Some (consumer) LCD "HD-ready" TVs claim to handle a 1080i input but not a 1080p input. Why is this? If they can handle 1080i, why can't they handle 1080p?

Surely 1080/50i and 1080/25p require the same bandwidth? 50 fields of 540 lines per second carries exactly as many pixels as 25 frames of 1080 lines per second. And surely it's less computationally expensive to downsample from 1080/25p to the TV's native resolution than from 1080/50i, because to properly downsample an interlaced image you first have to deinterlace it into a 1080/25p signal and then downsample that progressive-scan image.
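
To sanity-check the bandwidth claim, here's a rough pixel-rate calculation (a back-of-the-envelope sketch that ignores blanking intervals, chroma subsampling and compression):

WIDTH, HEIGHT = 1920, 1080

# 1080/50i: 50 fields per second, each field carrying half the lines (540).
pixels_per_sec_50i = WIDTH * (HEIGHT // 2) * 50

# 1080/25p: 25 full 1080-line frames per second.
pixels_per_sec_25p = WIDTH * HEIGHT * 25

print(f"1080/50i: {pixels_per_sec_50i:,} pixels/s")  # 51,840,000
print(f"1080/25p: {pixels_per_sec_25p:,} pixels/s")  # 51,840,000

Both come out to 51,840,000 pixels per second, so the raw pixel rate really is identical.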

Of course, most "HD-ready" TVs don't have a native 1920x1080 liquid crystal panel; instead they have something like 1366x768. Is the answer that these "HD-ready" displays don't actually deinterlace and then downsample an interlaced input, but instead just throw away one of the fields and scale the remaining 1920x540 image?
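
If so, the difference between the cheap approach and the proper one would look roughly like this (a sketch of my guess at what a budget scaler might do; I haven't confirmed any real hardware works this way):

import numpy as np

# A 1080i frame arrives as two 540-line fields.
top_field = np.zeros((540, 1920), dtype=np.uint8)    # lines 0, 2, 4, ...
bottom_field = np.ones((540, 1920), dtype=np.uint8)  # lines 1, 3, 5, ...

# Cheap approach: discard one field, then scale the 1920x540 remainder
# to the panel. Half the vertical detail is simply thrown away.
cheap = top_field

# Proper approach: weave the two fields back into a full 1080-line frame
# (fine for static content; moving content needs motion-adaptive
# deinterlacing), then downsample 1920x1080 to the panel's native 1366x768.
woven = np.empty((1080, 1920), dtype=np.uint8)
woven[0::2] = top_field     # even lines
woven[1::2] = bottom_field  # odd lines

print(cheap.shape)  # (540, 1920)
print(woven.shape)  # (1080, 1920)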

Many thanks,
Jack

Chris Medico
December 2nd, 2008, 08:08 AM
Because 1080p isn't part of the ATSC broadcast standard.