DV Info Net (https://www.dvinfo.net/forum/)
Forum: The View: Video Display Hardware and Software
Thread: Why does 1080i exist? (https://www.dvinfo.net/forum/view-video-display-hardware-software/118515-why-does-1080i-exist.html)

Aric Mannion April 3rd, 2008 03:36 PM

Why does 1080i exist?
 
I've been told that 720p/1080i TVs have a native resolution of 1366x768, so it's pointless to view something at 1080i because the TV has to downscale it to 720p and de-interlace it as well. If LCD TVs have to de-interlace, then why are we still making interlaced video? Is 720p better than 1080i on a monitor that has a native resolution of 1366x768? (I have the option to play Blu-ray at either 720p or 1080i, which is why I ask.)

Daniel Browning April 3rd, 2008 04:52 PM

Quote:

Originally Posted by Aric Mannion (Post 853803)
I've been told that 720p/1080i TVs have a native resolution of 1366x768, so it's pointless to view something at 1080i because the TV has to downscale it to 720p and de-interlace it as well.

Correct.

Quote:

Originally Posted by Aric Mannion (Post 853803)
If LCD TVs have to de-interlace, then why are we still making interlaced video?

I'm not making interlaced video.

Other people do because of the limitations of their camera (e.g. most cameras lack 60p), their skill (can't hold the camera steady enough for 30p), subject matter (can't always get it right the first time in sports), or delivery requirements.

Quote:

Originally Posted by Aric Mannion (Post 853803)
Is 720p better than 1080i on a monitor that has a native resolution of 1366x768? (I have the option to play Blu-ray at either 720p or 1080i, which is why I ask.)

Generally, yes, though in practice it comes down to which device, the player or the TV, has the better deinterlacing and resizing algorithms.

John Miller April 3rd, 2008 05:17 PM

30 frames per second look jerky. 60 frames per second don't. 60 frames per second require a lot of bandwidth. 60 fields per second require the same amount as 30 frames per second. That jerky-vs-smooth tradeoff is why interlacing was originally invented in the early 20th century. Once 1080p/60 camcorders become affordable to the average consumer, interlacing will eventually disappear.

Same arguments apply for 25 vs 50.
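
To put rough numbers on that bandwidth point, here is a back-of-the-envelope Python sketch (assuming uncompressed 8-bit 4:2:2 video at 16 bits per pixel; the exact sampling format is an assumption, and only the ratios matter):

BITS_PER_PIXEL = 16  # 8-bit 4:2:2 (8 bits luma + 8 bits shared chroma)

def raw_gbps(width, lines_per_picture, pictures_per_second):
    # uncompressed data rate in gigabits per second
    return width * lines_per_picture * pictures_per_second * BITS_PER_PIXEL / 1e9

print("1080p60: %.2f Gbit/s" % raw_gbps(1920, 1080, 60))  # ~1.99
print("1080i60: %.2f Gbit/s" % raw_gbps(1920, 540, 60))   # 60 fields of 540 lines, ~1.00
print("1080p30: %.2f Gbit/s" % raw_gbps(1920, 1080, 30))  # same ~1.00 as 1080i60
print("720p60:  %.2f Gbit/s" % raw_gbps(1280, 720, 60))   # ~0.88

The interlaced rate matches 30 progressive frames, while true 60p doubles it.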

The claim that 720p/1080i TVs are all 1366 x 768 isn't true. Native 1080p TVs (1920 x 1080) can also display 720p and 1080i.

Daniel Browning April 3rd, 2008 05:26 PM

Quote:

Originally Posted by John Miller (Post 853871)
30 frames per second look jerky. 60 frames per second don't.

For some subjects, you're right. Amateur home video, unscripted sports events, pans that don't follow a subject, etc.

Glenn Chan April 3rd, 2008 05:36 PM

Quote:

I've been told that 720p/1080i TVs have a native resolution of 1366x768.
Some of them do... but other TVs have 1920x1080 pixels. In the future I'd expect more and more TVs to be capable of full HD resolution (but possibly cropping some pixels on the sides for overscan).

Quote:

If LCD TVs have to de-interlace, then why are we still making interlaced video?
Possible answers:
- HDTV standards were established several years back. At that time the CRT was king, and flat-panel monitors weren't really good enough for critical evaluation, so systems were evaluated on CRT monitors (which obviously are fine with interlace).
Competing HD formats were announced as far back as 1990:
http://en.wikipedia.org/wiki/Grand_Alliance_(HDTV)
And before that, people were even thinking about doing analog HD.

- At that time, the choice was mostly between 720p (60 full frames per second) and 1080i60 (29.97 full frames per second). That's about the bandwidth that will easily fit over a single coaxial cable; a higher-bandwidth format like 1080p60 won't. Presumably the cameras and recording formats of that time would also have had problems with the higher bandwidth.

- Between 1080i and 720p, 1080i is capable of slightly higher spatial resolution. Because of interlacing, the camera has to filter/blur the image vertically a little bit to avoid interlacing nasties. The deinterlacer in the TV, depending on its quality, can also reduce resolution. (Right now most TVs will discard a field and interpolate, which loses half the vertical resolution; see the sketch below.) The engineers in favour of 1080i argue that deinterlacers will get better in the future.
Horizontally, most broadcasters transmit 1080i as a 1440x1080 raster.
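
A minimal sketch of that discard-and-interpolate approach (often called "bob" deinterlacing), using Python and NumPy rather than any actual TV's processing, shows where the lost vertical resolution goes:

# A toy example, not any real TV's firmware: keep only the even lines of an
# interlaced frame and line-double them, so only 540 unique lines survive.
import numpy as np

def bob_deinterlace(frame: np.ndarray) -> np.ndarray:
    """frame: (lines, width) grayscale image holding two woven fields."""
    top_field = frame[0::2, :]              # keep even lines only
    return np.repeat(top_field, 2, axis=0)  # line-double back to full height

interlaced = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
progressive = bob_deinterlace(interlaced)
print(progressive.shape)  # (1080, 1920), but only 540 unique lines of detail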

The argument for 720p is that:
- It avoids interlacing artifacts. This is important since most HDTVs will be flat panels and *need* to deinterlace the image. Yves Faroudja, a designer of high-end deinterlacers, has said that "deinterlacing does not work!".
- Interlacing is not a very efficient way of compressing imagery. Instead of (filtering and then) discarding half the data, it is better to use compression techniques like DCT and motion compensation. 720p uses bandwidth much more efficiently.
One EBU test also shows that 1080p50 looks better than 1080i50. (I believe their tests also show that 720p only looks better at low bitrates.)

Chris Soucy April 3rd, 2008 05:51 PM

Nope.................
 
720p/1080i/1080p sets can have almost any native screen resolution they like, as long as they physically have a minimum of 720 lines (see "weasel worded" below).

The most common "weasel worded" "HD READY" screens (here at least) are indeed 1366 x 768 or thereabouts.

The only thing those screens can display NATIVE is 720p; there is no 720i as far as I'm aware. 1080p (if available), 1080i, 576p, 576i, 480p and 480i all have to be scaled to play on these screens, with varying degrees of success.

To play 1080p or 1080i NATIVE, your screen must physically have a minimum of 1080 lines of 1920 pixels each. More is available, with 1920 x 1200 quite popular for computer monitors. (These are the only true Full HD screens.)
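
As a rough illustration of the scaling involved (a Python sketch; the 1366 x 768 figure is the "HD Ready" panel size discussed above):

PANEL_W, PANEL_H = 1366, 768  # typical "HD Ready" flat panel

SOURCES = {
    "720p": (1280, 720),         # nearly, but not exactly, 1:1 on this panel
    "1080i/1080p": (1920, 1080),
    "576i/576p": (720, 576),
    "480i/480p": (720, 480),
}

for name, (w, h) in SOURCES.items():
    # scale factor the TV's scaler must apply in each direction
    print("%-12s scale %.2fx horizontally, %.2fx vertically"
          % (name + ":", PANEL_W / w, PANEL_H / h))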


CS

Robert M Wright April 3rd, 2008 11:41 PM

If I recall correctly, interlaced video came into being because, back in the 1930s, it was cost-prohibitive to produce a cathode ray tube that could "paint" all 480 lines of video fast enough to avoid a significant lag between the first and last lines being painted (similar to today's difficulties with rolling shutters on CMOS sensors). Since it wasn't practical to paint all 480 lines at once every 1/30th of a second, the solution they came up with was to paint every other line and then fill in the gaps a 60th of a second later. I believe the increased temporal resolution was actually an unintended benefit.

Aric Mannion April 4th, 2008 08:56 AM

Thanks, I guess I'll switch the Blu-ray player to 720p instead of 1080i.
The more I think about HDV, the more I like it. I have a Sony FX1 and people constantly bash it because it is compressed and 1080i (or at least stretched to 1080i).
The only thing I don't like is that I hear it compresses even more when I capture it into Final Cut. I'm going to try to learn how to avoid that extra compression.

Greg Boston April 4th, 2008 09:23 AM

Quote:

Originally Posted by Robert M Wright (Post 854029)
If I recall correctly, interlaced video came into being because, back in the 1930s, it was cost-prohibitive to produce a cathode ray tube that could "paint" all 480 lines of video fast enough to avoid a significant lag between the first and last lines being painted (similar to today's difficulties with rolling shutters on CMOS sensors). Since it wasn't practical to paint all 480 lines at once every 1/30th of a second, the solution they came up with was to paint every other line and then fill in the gaps a 60th of a second later. I believe the increased temporal resolution was actually an unintended benefit.

Well, you are close. It wasn't cost-prohibitive in the normal sense. They simply didn't have circuitry designs that could achieve much faster than 30 fps. The problem was that, in the brighter environments where TV is viewed, 30p does have strobing issues. 24p is supposed to be watched in a darkened environment so that our 'persistence of vision' will fill in the space between frames. So they decided on 60 interlaced fields per second to keep the motion sampling rate high enough to achieve smooth motion. In the CRT world, this isn't such a bad thing, due to the persistence of the phosphor, which still shows a residual of one field as the next one is being painted on the adjacent lines. That helps your eyes perceive a full frame being displayed at once.

If you ever want to see the persistence of phosphor, shut off all room lights for a while so that your irises open up. Turn on a TV for a few moments, then turn it off. The screen will seem to glow for a while.

As to the 1366x768 statement, it is true that low-cost, budget LCD TVs have this limitation. Take a look at other LCDs of the same size with a much higher price tag and you'll see that they are full 1920x1080 panels. I had to point this out to my elderly parents when they wanted to buy an HDTV this past Christmas.

-gb-

Glenn Chan April 4th, 2008 01:52 PM

Quote:

So they decided on 60 interlaced fields per second to keep the motion sampling rate high enough to achieve smooth motion.
I don't buy that explanation. To my eyes, 60 interlaced fields per second looks weird; the motion doesn't look right.

One clear advantage of 60 fields per second is that flicker is pretty low, whereas 60 full frames per second would have taken up too much bandwidth. I think interlacing was a very good compression/bandwidth-reducing trick for analog, but now that we have digital compression techniques it's a very inefficient way of reducing bandwidth.

Robert M Wright April 4th, 2008 02:28 PM

Suffice it to say, 60i was an elegant solution to the technical challenges of the 1930s. Unfortunately, that legacy sort of haunts us nowadays.

John Miller April 4th, 2008 02:57 PM

You have to take into account something else, namely the original imaging of the video. Even though the compression schemes exist to manage the bandwidth of broadcast material, and the processing power of the latest generation of multiprocessor computers can (given enough money) handle encoding true 1080p60 in real time, affordable consumer camcorders cannot encode the analog information coming from the sensor(s). To do so, the electronics would have to sample and process the analog information at twice the rate. That significantly increases the cost - doubling performance in a single quantum leap ain't cheap.
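
For a feel of the numbers (a rough sketch only; blanking intervals and sensor readout overhead are ignored), the pixel rate the analog front end has to digitize roughly doubles going from 1080i60 to true 1080p60:

def mpixels_per_second(width, lines_per_picture, pictures_per_second):
    # active pixels digitized per second, in millions
    return width * lines_per_picture * pictures_per_second / 1e6

print("1080i60: %.0f Mpixel/s" % mpixels_per_second(1920, 540, 60))   # ~62
print("1080p60: %.0f Mpixel/s" % mpixels_per_second(1920, 1080, 60))  # ~124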

Mark Keck April 4th, 2008 04:27 PM

A little history
 
Interlacing was invented by an RCA engineer named Randall C. Ballard during the 1930s to improve picture quality without increasing the bandwidth of the signal. It is interesting to note that at the time all video was 16 fps, so his interlacing scheme transmitted half the information at 32 fps, nowhere near the 60 fields per second used today... not sure when that got picked up. As mentioned previously, all this was to work around limitations, not necessarily circuitry, as a large portion of the interlacing was actually done mechanically.

For a little light reading, here's the original patent, issued in 1939.

http://patft.uspto.gov/netacgi/nph-P...mber=2,152,234

Mark

Robert M Wright April 4th, 2008 04:40 PM

Quote:

Originally Posted by John Miller (Post 854435)
You have to take into account something else, namely the original imaging of the video. Even though the compression schemes exist to manage the bandwidth of broadcast material, and the processing power of the latest generation of multiprocessor computers can (given enough money) handle encoding true 1080p60 in real time, affordable consumer camcorders cannot encode the analog information coming from the sensor(s). To do so, the electronics would have to sample and process the analog information at twice the rate. That significantly increases the cost - doubling performance in a single quantum leap ain't cheap.

Actually, I believe the biggest bottleneck is getting in-camera compression that is efficient enough to record 1080p60 inexpensively. Unless I'm mistaken, many cameras (like the HVX200) do the raw A/D conversion at 1080p60 before converting to whatever format will be recorded. MPEG-2 isn't really very well suited to compressing 1080p60: there's no problem getting MPEG-2 encoders fast enough, but the bitrate would be unwieldy with currently inexpensive recording media. AVC is coming along nicely, though, and should be practical for delivering somewhat better than HDV-like quality at HDV-like bitrates (25 Mbps VBR for 1080p60 on cheap flash memory, like Class 6 SDHC) in the not-so-distant future. If a major camera manufacturer decided to make a firm commitment to wavelet (rather than DCT-like) lossy compression for acquisition, they could get there in pretty short order (assuming proper management of software development - a piece of cake if it's done right).
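
As a rough check on those bitrates (a Python sketch assuming an 8-bit 4:2:0 source at 12 bits per pixel, which is an assumption about the chroma format), the compression ratio needed to hit 25 Mbps nearly triples going from HDV's 1080i to 1080p60:

TARGET_MBPS = 25  # the HDV-like bitrate mentioned above

def compression_ratio(width, height, full_frames_per_second, bits_per_pixel=12):
    # how much the raw source must be squeezed to fit the target bitrate
    raw_mbps = width * height * full_frames_per_second * bits_per_pixel / 1e6
    return raw_mbps / TARGET_MBPS

# HDV 1080i60 carries 1440x1080, i.e. 30 full frames' worth of pixels per second
print("HDV 1080i: %.0f:1" % compression_ratio(1440, 1080, 30))  # ~22:1
print("1080p60:   %.0f:1" % compression_ratio(1920, 1080, 60))  # ~60:1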

Peter Wiley April 4th, 2008 09:36 PM

http://hd1080i.blogspot.com/


