View Full Version : 1080p a joke?


Marco Wagner
April 21st, 2006, 01:37 PM
I came across this article today. Some interesting commentary about the uselessness of 1080p. What do you think?

http://theinquirer.net/?article=31089

Don Blish
April 21st, 2006, 02:29 PM
I imagine that virtually all 1080i/p sets sold henceforth will be LCD or plasma panels, which will use a 60Hz refresh (or 50Hz if the media demands). I also believe that all interlacing means is that each 60th of a second the odd or even rows are updated. Sony's proof-of-concept BD-ROM, as I recall, was done in 24p, and Blu-ray players will be expected to add frames to get to 30i or 30p depending on how the player is set (or what it discovers from the HDMI dialog). I believe his comments on horizontal scan rates are entirely CRT-centric.

Joshua Provost
April 21st, 2006, 03:43 PM
Hey, 1080p makes sense. LCDs and plasmas are progressive devices, so displaying an interlaced format requires some voodoo. You want a progressive format... but...

There is no such thing as 1080p content right now. It's 720p or 1080i. If you have a native 1080p display, you're either upscaling 720p or field-doubling 1080i. So, what's the point?

Hey, the 1080p display may be compatible with whatever standard emerges for 1080p... or it may not. It's hard to tell, and it may not support the refresh rate required at that time.

1080i is for interlaced displays (tubes). The big mistake was forging HDTV with TWO "standards," and letting either of them be INTERLACED. Pointless, considering progressive video is backward-compatible with interlaced displays, but interlaced video is not forward-compatible with progressive displays without a lot of degrading manipulation.

Of course, those decisions were set in stone a long time ago... if you ask me, 1080i is the joke.

Marco Wagner
April 21st, 2006, 04:08 PM
Yeah I never quite understood that either.

David Heath
April 21st, 2006, 05:00 PM
There is no such thing as 1080p content right now. It's 720p or 1080i. If you have a native 1080p display, you're either upscaling 720p or field-doubling 1080i. So, what's the point?
There's another option apart from 720p and 1080i, and that's 1080psf (psf = progressive segmented frame), which is how much drama in 50Hz countries is now being made.

Briefly, the origination is true 1080p/25, true progressive and full 1080 resolution, though with the motion rendition of film. That is then recorded/delivered in an interlace compatible form, for full compatibility with a true interlace signal, whilst preserving the true progressive nature.

For a true progressive frame, the lines of a frame may be numbered and transmitted 1,2,3,4,5,6.......1078,1079,1080. Converted to psf, these identical lines are then transmitted 1,3,5......1077,1079 (new field) 2,4,6....1078,1080. Conversion between the two is obviously transparent - a matter of reordering lines, NOT involving any scaling or destructive manipulation.
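As a rough illustration (a minimal Python/NumPy sketch, purely hypothetical and nothing from the actual broadcast chain), the split and the reassembly are just row slicing, and the round trip is exactly lossless:

import numpy as np

# One 1080-line frame of arbitrary sample values (illustrative only).
frame = np.arange(1080 * 1920).reshape(1080, 1920)

# psf ordering: odd-numbered lines (rows 0, 2, 4, ... here) go in one field,
# even-numbered lines (rows 1, 3, 5, ...) in the other.
field_1 = frame[0::2, :]   # 540 lines
field_2 = frame[1::2, :]   # 540 lines

# Reassembly at the display: interleave the two fields back into a frame.
rebuilt = np.empty_like(frame)
rebuilt[0::2, :] = field_1
rebuilt[1::2, :] = field_2

assert (frame == rebuilt).all()   # pure reordering: no scaling, nothing lost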

1080p/50 should be the ultimate goal. 1080psf/25 and 1080i/25 in the meantime seems a sensible interim (chosen according to subject, drama or sport, say) - transmission wise they are equivalent.

Marvin Emms
April 21st, 2006, 09:12 PM
Joshua,

Agreed, 1080i is an abomination. Interlacing is an analog trick used to reduce bandwidth for a given static resolution; it has no place in a compressed digital world. You can compress a progressive frame in a given amount of data far more easily than its two fields separately, and for MPEG, with its intra frames and motion-predicted frames, interlace makes even less sense.

Leo Pepingco
April 21st, 2006, 09:25 PM
Interlaced is no abomination....

Some of us who work in the Australian TV industry use interlaced because that is the medium in which television is delivered...

And I disagree that progressive is easier and better to compress. With progressive, you record and store every frame. With interlaced, you only need to record the changes... However, I could be wrong. But despite that, there is a reason why we have interlaced: there are some major broadcasting companies out there who would despise progressive because of the difficulty it creates for their signal and bandwidth.

Graeme Nattress
April 21st, 2006, 09:48 PM
Marvin is correct, though. Progressive works much better with inter-frame compression like MPEG-2. You don't need to double the bit rate to get 1080p60 at the same quality as 1080i60, and you get a higher-res picture because there's no need for interlace filtering.

Interlace is poor compression by modern standards, and should be consigned to the dustbin of history.

Graeme

Marvin Emms
April 21st, 2006, 10:44 PM
Thanks Graeme, but I'd go further.

"Interlace is poor compression by modern standards"

It's throwing away half the picture information to obtain half the data rate, and in that sense it's not really compression at all. Any real compression performs better than that.


Leo,

"Some of us who work in the Australian TV industry use interlaced because that is the medium to which television is delivered..."

This is the same as arguing that since most of a channel's input is delivered in standard def now, they should never bother moving to high def. Either you want to improve things, or you are happy with stasis.

"With Progressive, you record and store every frame. With interlaced, you only leend to record the changes."

This is nonsense.

"There is a reason why we have interlaced, there are some major broadcasting companies out there who would dspise Progressive based on the difficult nature it has on their signal and bandwidth."

You are missing the point: there is no reason why a progressive channel would need to take more bandwidth. 1080p24, 1080p25 or 1080p30 could be broadcast in the same space as 1080i48/50/60 and then displayed on a TV with each frame shown twice, the same thing we see at the cinema. Fast-moving material could at worst be broadcast at 540p and *displayed* scaled up to 1080p, which makes more sense than displaying an interlaced 1080i signal at 50 fields a second, and would take the same bandwidth using the same compression system as interlaced. But going to the root of the compression system, there is no reason to present the codec with only 540 lines of progressive information, which is what 1080i does. A more advanced compression system should deliver better practical resolution from a 1080p input, even at 50 or 60 fps, than a 1080i system can deliver in the same bandwidth.
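A quick back-of-the-envelope check of the raw (uncompressed) pixel rates, just to illustrate the point before any codec gets involved (the figures below are illustrative arithmetic, not broadcast specs):

# Raw pixel rates before compression; illustrative arithmetic only.
def pixel_rate(width, lines_per_picture, pictures_per_second):
    return width * lines_per_picture * pictures_per_second

p25  = pixel_rate(1920, 1080, 25)   # 1080p25: 25 full frames/s
i50  = pixel_rate(1920, 540, 50)    # 1080i50: 50 fields/s of 540 lines each
p50  = pixel_rate(1920, 1080, 50)   # 1080p50
p720 = pixel_rate(1280, 720, 50)    # 720p50 for comparison

print(p25, i50)      # identical: 51,840,000 samples/s each
print(p50 / i50)     # 2.0 -> the gap a smarter codec has to close
print(p720 / i50)    # ~0.89 -> 720p50 actually carries slightly less raw data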

1080i is throwing away visual information the codecs never get a chance to compress. With a clever compression system there is no reason why a 1080p50/60 signal could not be compressed to exceed the performance of a 720p broadcast in exactly the same bandwidth: with nothing more than downsampling you can get a 720p50 picture from 1080p50 and compress that to get potentially the same quality as a 720p50 broadcast, so a more advanced compression system should have no problem exceeding the quality (however slightly) of 720p50 in the same bandwidth, given the 1080p50 image.

The thing that bothers me the most is that while computer technology, silicon fabrication methods and HD storage double in capability every 18 months or so, TV is essentially doing the headless-chicken dance due to legacy formats and snake oil. We had 1000-line TV being broadcast in the 1940s! OK, it was 4:3 (as far as I am aware), and we are now 60 years on. In that time computers have gone from ENIAC-type buildings full of valves clocked in the kilohertz range to 64-bit CPUs in the region of 5GHz, with total computation in the ballpark of a thousand billion times faster, and the cutting edge of TV is still 1000 interlaced lines. OK, as far as the end user is concerned it's now in colour and in 16:9, but TV technology seems to move at glacial speed, and if the past is anything to go by we'll be stuck with whatever finally emerges as the dominant broadcast method (in the UK it's looking to be 1080i, even for sport) for the 40 years it took to replace the last system.

The people who shape TV technology seem to lack vision. It's considered OK to produce something only fractionally better than the current system, even though it will likely take decades before the next standard comes along. While most people in the industry recognise 1080p is the hottest thing since microwaved apple pie, the TV manufacturers have responded by covering their products' performance with lies ("Ready for 1080p" = the Sony lie: the Bravia has no 1080p input method). In a few years, when 1080p capability in 24/25/30 or 50/60 becomes the standard for optical sensors intended for hi-def cameras and the data rates become feasible with off-the-shelf PC hardware, the broadcast industry will be hamstrung by the limitations of the current interlaced format in its standards, its equipment and in the homes of consumers.

Graeme Nattress
April 22nd, 2006, 06:51 AM
I certainly see what you're getting at. For instance, you can get a better picture at the same data rate by, instead of subsampling to 4:2:2 or 4:2:0 or 4:1:1, leaving Cb and Cr at 4:4:4 and simply compressing them more strongly. Again, it's the difference between compressing strongly and just throwing away.
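A sketch of that contrast in raw sample counts (no actual codec here, just Python to show how much 4:2:0 discards before compression even starts):

import numpy as np

cb = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)   # stand-in chroma plane

# 4:2:0-style subsampling: three of every four chroma samples are discarded
# before the codec ever sees them.
cb_420 = cb[::2, ::2]
print(cb.size, cb_420.size)   # 2,073,600 vs 518,400 samples

# The alternative: keep the plane at 4:4:4 and let the codec spend fewer bits
# on it; the loss then comes from compression rather than outright discarding.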

What people forget about interlace is that it must be strongly vertically filtered to stop flicker, and this kills resolution.

Graeme

Michael Devlin
April 22nd, 2006, 01:19 PM
While I agree with all of the technical arguments against interlacing, I have to admit that the end result is not always consistent with the technical theory espoused here.

Consider American Football on CBS (1080 60i) versus Fox (720 60P). In our case we are using an HD Tivo Satellite receiver (DirecTV, NFL Sunday Ticket HD). We watch the games on our post-production projector (NEC iS8-2K) which is actually a 2048x1080 DLP Digital Cinema projector, projecting on a 16' Stewart Films GrayHawk screen.

The CBS games (1080 60i) look infinitely better than the Fox games (720 60P). The 720P games look blocky and low rez, while the 1080i games are great. I don't see any more strange motion artifacts on CBS than Fox (there are plenty on both). If anything Fox has more annoying motion artifacts. In the early days Fox had better slow-mo than CBS, but now I can't tell the difference (except that CBS has higher rez).

We are running the 1080i from the satellite receiver through a high-end scaler (which de-interlaces, among other things) and sending 1080 60p to the projector. Maybe that helps. Of course the scaler also does the best job it can up-rezzing 720p to 1080p.

Perhaps Fox is taking other short cuts that reduce picture quality, but at least for American football I will certainly take 1080 60i over anything else currently being broadcast.

Of course I would love 1080 60P broadcasts (or 4K 444 60P!), and agree with the comments about the slow TV industry. Of course, it is nothing compared to the slow film industry (like 24P is somehow a virtue!).

Another error in the article cited at the top of this thread: he says "there are no off-the-shelf broadcast cameras that can handle 1080p/60", which is not true. Sony's standard (and very successful) HDTV cameras (HDC-1000 & HDC-1500) do in fact support 1080 60p, although I doubt any major broadcaster is using them that way, except maybe for slow-mo effects.

Ron Evans
April 22nd, 2006, 01:35 PM
Graeme, can you explain what you mean by:

"What people forget about interlace is that it must be strongly vertically filtered to stop flicker, and this kills resolution."

I may be incorrect in assuming that interlaced signals and CRT displays are really parts of one system. The combination of our eyes/brain and the CRT fools us into thinking that the image is detailed, lacks flicker, etc. Putting this interlaced image onto a progressive display means having to manipulate the image into a progressive format; the electronics involved governs how good the picture looks, but it is likely to be worse than on a CRT. With a progressive image it's possible to grab a frame and analyse quality etc. Grabbing a field from an interlaced image gives only half the vertical detail (the other half is thrown away for this scan, and the alternate half for the next scan), so trying to grab a frame from an interlaced image is more a measure of how the deinterlacing algorithm performs than of the image itself!!!
I think the downside at the moment is that for broadcast TV a 4x3 CRT will likely give the best picture, and for watching DVDs a 16x9 progressive set will be the best. Means one really needs two TVs!!!!

Since I really dislike the stutter of the film look I can't wait for 1080p60 myself if I am still around if and when this happens.

Ron Evans

David Heath
April 22nd, 2006, 01:51 PM
The CBS games (1080 60i) look infinitely better than the Fox games (720 60P). The 720P games look blocky and low rez, while the 1080i games are great.
Michael, I suspect the best-looking games are the ones where your team wins! Joking apart, I obviously can't comment on these particular examples, but differences may be caused by all sorts of factors, from the exact cameras used, to individual stations' bitrates, to how individual displays deal with one type of signal or the other.

Assuming your observations are accurate, it is conceivable that the differences are caused by other factors, and that the interlace/progressive difference adds little to which looks better, one way or the other.

Michael Devlin
April 22nd, 2006, 02:16 PM
David I agree that there are many other factors besides progressive versus interlace. The whole chain as deployed in the real (economic) environment is what matters. Now there are rumours that DirecTV is going to be down-rezzing everything to get more channels out of their satellites. Oh well.

Graeme Nattress
April 22nd, 2006, 04:10 PM
Well, I have some FOX native DVCproHD 720p60 on my system, from when they were using Film Effects, and it's by no means blocky or low rez. It's very nice indeed. You can't really compare how two totally different broadcasters do things, because there are too many variables.

One may look better in your home, but because you don't know the route it took to get to your TV, and also how your TV handles progressive and interlaced, you can't compare. I understand that you want to say, "but I have to live with the practical reality of how it looks in my home", and that's not in doubt. The issue is showing how a broadcaster could do better by ditching ancient interlace for better, more modern technology.

As for interlace filtering: when the camera captures the image from the CCD, the vertical resolution of each field is filtered down to about 70% of the full vertical resolution to stop interlace interline twitter artifacts. That means a 1080i system doesn't have a vertical resolution of around 1080 lines, but more like 760. So each field has about 380 lines of resolution, not 540.

A true progressive system of 720p has a vertical resolution of around 720 lines, and a 1080p system around 1080 lines - that's quite a bit more resolution than the equivalent interlaced system.
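Putting rough numbers on that (the ~70% filtering factor is the figure quoted above; this is only arithmetic, not a measurement):

INTERLACE_FILTER = 0.7                      # vertical filtering applied to interlaced capture

effective_1080i = 1080 * INTERLACE_FILTER   # ~756 lines for the full frame
per_field_1080i = effective_1080i / 2       # ~378 lines per field

effective_720p  = 720                       # progressive: no interlace filtering needed
effective_1080p = 1080

print(round(effective_1080i), round(per_field_1080i))   # 756 and 378
print(effective_720p, effective_1080p)                  # 720 and 1080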

Graeme

Adam Keen
April 22nd, 2006, 06:38 PM
I can't believe news comes from blogs. I think the article writer is trying to be hip by calling it a "bog", or just can't proofread, or is just lazy.

J. Stephen McDonald
April 22nd, 2006, 10:48 PM
I recently obtained some backdoor information from the maker of my CRT HDTV that isn't ordinarily made public. It turns out that although their CRTs are promoted as using 1080i, the image that is shot onto the screen is actually 720p. The dot pitch of the screen mask and the phosphor dots wouldn't support any more scanning lines anyway, so the 720p scanning they use produces a better image than if they tried to shoot 1080i onto a screen that was inadequate for it. There are many other mysterious and often undisclosed details like this for most HDTV monitors. The pathways from the tuners or inputs to the screens follow different routes. Some of them work better than others, but whether a TV is labeled 720p, 768p, 1080i or 1080p or whatever doesn't tell the whole story about how the image is actually displayed.

Marvin Emms
April 23rd, 2006, 09:49 AM
That is something worth looking into, as, in general, 720p TVs display 1080i by simply expanding each 540-line field into a 720-line frame. In other words, they treat 1080i as 540p.

You could buy a 1080i TV and end up with sub-540-line performance for films.
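A minimal sketch of that cheap "bob" handling (nearest-neighbour scaling for brevity; real sets filter better, but the information thrown away is the same, and everything here is illustrative):

import numpy as np

def bob_field_to_panel(field, panel_lines):
    # Scale a single field (e.g. 540 lines of a 1080i signal) to the panel height.
    src_lines = field.shape[0]
    rows = (np.arange(panel_lines) * src_lines) // panel_lines
    return field[rows, :]

field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)   # one 1080i field
panel_frame = bob_field_to_panel(field, 720)                     # 720-line panel
print(panel_frame.shape)   # (720, 1920), but only 540 distinct source lines in it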

I don't suppose you'd give us the model of the TV that does this trick?

Have you verified the TV is really only scanning 720 lines?

Graeme Nattress
April 23rd, 2006, 09:54 AM
Turning 1080i60 into 540p60 is a very easy trick for a TV manufacturer, because doing decent-quality deinterlacing is hard, as is working out whether an image needs deinterlacing at all (if it's 30p in 60i) or needs pulldown removed.

However, why this is done is not really the TV manufacturers' fault, but that of the people who insist on making interlaced cameras, tape formats and broadcasts, when the first thing the HD standard should have done is banish interlace to the bowels of history.

Graeme

Paulo Teixeira
April 23rd, 2006, 03:08 PM
I find it mind-boggling that there are so many articles trying to prevent people from buying 1080p TV sets. I would hope it isn't because Sony is backing it, like they are with Blu-ray. On the Blu-ray side people are saying it will be like Beta all over again, even though there are many companies backing it as well. 1080p is here to stay, since Sony is releasing all future TVs as 1080p and even Panasonic is excited about their 1080p plasma TV that should be out shortly.

The way I see it, a 1080 60i signal should look a lot better on a 1080 60p TV than it would on a 720/768 60p TV, although I do see a problem when trying to show a 720p signal on a 1080p TV. The solution would be to show the 720p picture as a rectangle inside the 1080 lines; that way you would see black bars not only on the top and bottom but also on the sides. At least that would save you from buying two TVs.

J. Stephen McDonald
April 23rd, 2006, 06:47 PM
I don't suppose you'd give us the model of the TV that does this trick?

Have you verified the TV is really only scanning 720 lines?

My CRT HDTV is a JVC AV-30W585. No, I haven't verified that it actually scans 720p. They may well have just been blowing smoke with that brief answer about it having 720p resolution. The picture it displays is very sharp; it was better than the dozens of other CRTs I considered when I bought it, and even looked slightly sharper than the Sony fine-pitch models. When I get my own HDV camcorder and a resolution chart, I should be able to determine these screen specs on my own.

Marvin Emms
April 23rd, 2006, 11:31 PM
Paulo, whoa there, you have the wrong end of the stick.

The point is not that 1080p is in any way a bad thing. The point of the article is that the 1080p claimed by most TVs that say they support it is a lie. Even Sony, who are pushing 1080p for Blu-ray, are lying when they say their Bravia, which genuinely does have 1080 lines of resolution, is 'ready for 1080p': the very best it can accept is 1080i, with internal processing.

For displaying 720p on a 1080p set, it's fairly trivial just to stretch the image electronically. Don't confuse very easy pixel operations like interpolation (which produce high-quality results) with the motion-adaptive warp and deinterlace algorithms required for 1080i to display as 1080p even half decently.
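A toy example of that "easy pixel operation" (vertical-only linear interpolation of a 720-line plane up to 1080 lines; a generic sketch, not any real TV's scaler):

import numpy as np

def upscale_lines(img, out_lines):
    # Linear interpolation between neighbouring source lines.
    src_lines = img.shape[0]
    pos = np.linspace(0, src_lines - 1, out_lines)   # fractional source row per output row
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, src_lines - 1)
    frac = (pos - lo)[:, None]
    return (1 - frac) * img[lo, :] + frac * img[hi, :]

img_720 = np.random.rand(720, 1280)         # a 720p luma plane
img_1080 = upscale_lines(img_720, 1080)     # stretched to 1080 lines
print(img_1080.shape)                       # (1080, 1280)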

Graeme,

I think the blame needs spreading around rather more than that. HDMI, the primary standard for getting HD signals to a TV digitally, did not make 1080p60 a requirement and in many ways was not aimed at such high performance at launch. They are now increasing the data rate it can handle, but the damage is mostly done: chip makers didn't support it, and TV makers didn't either. Now that 1080p is suddenly the hot thing in media, TV makers feel happy labeling their equipment '1080p ready', a term with no real meaning, even if they aren't 1080p compatible at any frame rate and simply deinterlace 1080i. Makers of HDTV cameras have long been releasing products based on low-resolution chips, upscaling to a 'recognised' format and calling it hi-def. It's still embedded in the industry that analog tricks such as recording a downsampled image and replaying it with interpolation are acceptable, even though everything is moving to digital.

The computer industry has had to turn on a dime; the TV industry is an oil tanker. When 405-line was replaced here, the major consideration was that the replacement be slightly better than the new US NTSC standard of 525 lines. So we ended up with 625 lines, 576 of them visible. It didn't seem to matter that 405-line TV lasted for 40 years, and the PAL system that replaced it didn't even double the number of visible lines, barely 50% more. Now PAL/NTSC are 40+ years old; 1080 is almost double the visible lines of PAL but still interlaced, and 720p is progressive but only 50% more lines than NTSC and only 25% more than PAL.

The problem is that no one who contributes to the standards or the broadcasting laws seems able to see more than a few steps in front of their feet, much less the 40-year road ahead of hi-def TV. Even the BBC are arguing, on the grounds of viewers' average eyesight and screen sizes mostly under 40 inches, that they shouldn't have to broadcast more than 720 lines. With blue semiconductor laser technology taking off, in the next decade we may see laser projectors that allow us to turn an entire wall of a room into a bright TV; should that happen, standards like 720p and 1080i are going to look downright disappointing.

You could also blame the public, who want HDTV before the manufacturers have geared up to do it properly and will kiss the ass of 540p and call it ice cream.

Simon Wyndham
April 24th, 2006, 01:03 AM
It depends whether you are talking about 1080p at 60fps or 1080p at 24/25fps.

Using 25p as an example because it is easier: if a movie on the disc was shot on film etc., then each pair of alternate fields will come from the same frame. A progressive TV can take those alternate fields and combine them into a true 1080p image. This is exactly how we have been displaying true progressive scan from standard DVDs, and I see no reason why it should be any different for high def.

1080i is just the carrier. 1080p at 60fps is a different matter though.

Marvin Emms
April 24th, 2006, 01:51 AM
If a TV cannot accept a 1080p25 input, only 1080i50, I do not think it should be marketed as a 1080p TV, even if it has 1080 lines and deinterlaces.

The carrier is important. Feed 1080i50 to a 720-line TV and you will be lucky to end up with more than 540 real lines of resolution on most sets; certainly the full 720 is a pipe dream without the level of deinterlacing currently found only in the most expensive 1080-line TVs. A special case of 1080psf could be supported inside 720p TVs fairly easily, but that would require a TV that was 1080p in everything but final display, and it's easier and cheaper not to. A 720p TV that supported 1080p25, though, would have to be 1080p internally; downsampling to 720 would be fairly trivial, and there is no reason the result would not be close to full 720-line performance.

To even have 1080p50/60 as a future option, support in the TV hardware must be made mandatory. Failing to do this has kneecapped 1080p in both its 24/25 and 50/60 flavours. Furthermore, 1080psf is not automatically equivalent to 1080p when compressed: although the MPEG standards can rejoin fields to compress whole frames, most of a regular channel's broadcasting will be done with 1080i cameras, and there is little expectation that the encoders will be set up optimally for film media. Satellite broadcasters in the UK and US have brought new lows to the expectations for standard def, with 488x576 not unknown for PAL broadcasts in the UK, even in anamorphic widescreen, and with the standard for SD DirecTV broadcasts being 480x480 pixels.

Even the 540p50 most sports viewers in the UK will end up seeing, as a result of Sky's decision to broadcast sports as 1080i and the majority of affordable TVs being 720p, will look pretty good in comparison with the interlaced 544x576 resolution most Sky channels use.

It's just disappointing that these half-hearted measures will be stuck with us for so long.

Ron Evans
April 24th, 2006, 06:54 AM
I think that, once again, the limits of technology at the price people are prepared to pay will limit what users get, mainly because there is a requirement for one size to fit all. We have broadcast and film industries with this mindset while the public are embracing an age of choice in media. Again, an industry out of touch with its clients. It used to be that almost everyone I knew had much the same TV. Now it is very different. Some have fancy projection TVs in a special room, most have many TVs, and still a large number have the same old TV with a VHS player, not even a DVD player!!! Their kids have cell phones with MP3 and MPEG-4 video, iPods, and watch most programs on the PC. If the industry was smart (!!!!!) they would choose an acquisition format that could easily be transformed into any output format they choose, at varying price points. You want cheap, you get MPEG-4 at 120 pixels etc.; you want the best, you get 1080p60 or greater with full PCM 6.1 sound etc. That way users would be able to choose the output format appropriately.

Simon, I understand what you are saying about 25p etc., but this is the very thing that produces the stuttering I detest!!! It is electronic emulation of a two-blade film projector, and the original acquisition is the problem: the frame rate is too slow for motion. If the original were shot at 60, or some easy multiple, then the opposite could be done for people who really like the stutter of the last century: just leave frames out rather than repeat frames as we do now!!!! This could be done in the player or even the TV. There are colour-effect/temperature choices on most TVs now; adding frame manipulation could be another choice, to produce the "old film look".

Ron Evans

Marvin Emms
April 24th, 2006, 07:21 AM
Ron,

I've never really noticed a problem with 25fps on TV, and that's only 4% faster, but then I'm in a PAL region so we get one genuine cinema frame in one TV frame, cleanly. Unless it's been shot on video with too high a shutter speed, and then movement can look nasty. Could the artifact you are describing be due to the 3:2 pulldown used to display 24fps material on 60-fields-a-second NTSC?
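For anyone unfamiliar with it, the 3:2 cadence is easy to see in a few lines of Python (illustrative only): 24 film frames a second are spread over 60 fields a second, so frames are alternately held for three fields and two, and it is that uneven hold that reads as judder.

def pulldown_32(frame_ids):
    # Alternate 3-field and 2-field holds: 24 frames/s -> 60 fields/s.
    fields = []
    for i, frame in enumerate(frame_ids):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -> 4 frames over 10 fields

In a PAL region each film frame simply occupies two fields (with the 4% speed-up), so the cadence stays even.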

Graeme Nattress
April 24th, 2006, 07:58 AM
Simon's right that p over i, i.e. 25p embedded in 50i, should go through all the stages of camera, tape, edit, TV and eye, and look progressive to the viewer without picking up nasties. However, when a TV says 1080p, it should mean it supports 1080p50 or 1080p60, not the above example of 1080p25 embedded in 1080i50, as then it's really just a 1080i display.

Yes, the blame can be laid in a lot of places, but really, it lies with the broadcasters who insist on interlace. They're wrong, but you try telling them that!

As for 720p: done properly, it would give the vast majority of home viewers the best all-round experience, as it compresses more easily, is progressive just like their flat screens, and has just the same vertical resolution as 1080i.

Graeme

Ron Evans
April 24th, 2006, 08:03 AM
Although I live in Canada now (emigrated from the UK in 1970), I return frequently, and it usually takes me two or three days before the flicker on the TV sets goes away!!! For me it isn't the TV system that I have a problem with, it is the film frame rate itself. It's too slow, just as 50Hz is really too slow a refresh rate until one's brain discards the effect. Film in the cinema stutters, and this is just compounded when it is transmitted for TV or put onto DVDs. This is a well-known issue and results in limitations to the shot angles etc. that are used in film to reduce the effect. I shot on film before moving to video when I could afford it. Now I look forward to the next improvement: smoother motion, more resolution, more latitude. Video for me is being there. I am just advocating an approach that acquires a pristine original which can be degraded any way one likes afterwards. That way chase sequences could be smooth in real time and dreamy sequences could have a slowed frame rate, even smearing etc. Right now the film approach leads to fixed shots that exploit the resolution and colour of the film stock, but panning has severe restrictions because it magnifies the frame-rate problem: it has to be either very slow (with a plain background) or very fast. Tracking shots of people need shallow depth of field because the background would stutter badly if everything was in focus, etc., etc.

Ron Evans