DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Sony XDCAM EX Pro Handhelds (https://www.dvinfo.net/forum/sony-xdcam-ex-pro-handhelds/)
-   -   XDcamEX vs JVC GY HD250 (https://www.dvinfo.net/forum/sony-xdcam-ex-pro-handhelds/100357-xdcamex-vs-jvc-gy-hd250.html)

Steven Thomas November 2nd, 2007 10:54 AM

Quote:

Originally Posted by Thomas Smet (Post 769050)
I bought my parents a 32" HDTV and when sitting on their couch 480p DVD's look just as good as HD channels. Of course the closer I get the more that changes but bumping up to 720p gives that little extra boost for when you are really close.


Of course, distance is a perception factor, but it sure looks like "day and night" to me when switching between the same movie on 480p DVD versus HD.
Looking at a 50" plasma from about 15 ft.

It's really not until you can do a direct comparison that things become more obvious.

Jody Eldred November 11th, 2007 02:55 AM

Quote:

Originally Posted by Werner Wesp (Post 768334)
Sometimes I wonder...

How many people actually know that 720p50 is far superior to 1080i50? Or how many even know that 1080i is NOT full HD? And yet - it is just a matter of the easiest mathematics:

50 x 1280 x 720 pps or 50 x 1440 x 540 pps is kinda obvious. (and that's not even taking into account the interlaced filter you need to keep the image from being jittery)

Not to mention all HD screens nowadays are progressive, so you cannot watch 1080i optimally.

Choosing between 1080i and 720p might be difficult, but not as an acquisition format: you can acquire 720p and make perfect slow motion and whatever else you want to do, and then convert it to 1080i if a client asks for 1080i. If you choose 1080i as an acquisition format, you can't make any serious slow motion (it'll have almost half the resolution and just half the temporal definition) or use any other effect that requires deinterlacing. Furthermore, you cannot supply a client with seriously good-looking 720p. Pretty simple if you ask me. But some people will keep holding on to 1080i, and as a mathematician and image processing expert, I can only say: I suppose they're led by the fact that 1080 sounds larger than 720...?

Anyhow. That's not even taking into account that the MPEG2 compression scheme has no higher-order handling for interlaced images, thus losing efficiency with 1080i. 1080i in MPEG2 at 35 Mbps can't be compressed better than 720p in MPEG2 at 20 Mbps. I'm not especially promoting JVC here (although I know they have the better product here), but I cannot agree with Sony for what they are doing: they are making XDCAM look like a higher-end format, but any scientist (as I am) knows they are marketing HDV under a different name: it is also 4:2:0, and 18 and 25 Mbps is HDV. 35 Mbps is just SLIGHTLY superior to standard Z1 HDV. It is obviously all long GOP. XDCAM HD is HDV in its 18 and 25 Mbps modes and just very slightly better at 35 Mbps, while retaining all the disadvantages that keep it from being a high-end format: long-GOP MPEG2, interlaced, just 35 Mbps in interlaced, 4:2:0, ....

Quality of the camera and lens, we'll have to see. If the EX wanders onto my desk and I want to get the best out of it, I'm recording in 720p. Needless to say, that's what I do with the Panasonic HPX-500 that I have as well, and with the JVC GY-HD201 (although there I have no choice).


Sure is a lot of math being talked about here. Strange, because not too many viewers watch TV with a calculator and test equipment in their hand...

;-)

Here's a simple test for you:

Shoot a scene in 720p and upconvert it to 1080i and 1080p.
Shoot the identical scene in 1080i and 1080p.
Compare the two.
Native 1080 will CLEARLY look superior to upconverted 720.

Downconvert the 1080 footage to 720.
Compare it to the natively shot 720.
It will match or look superior to the native 720 footage.

Sorry lads. The proof is in the pudding, not in the mathematics.

I'll put the XDCAM EX against any 720P camera out there.

Tim Polster November 11th, 2007 08:42 AM

I think Werner still has a valid point.

Acquiring in 60p gives one more options in post, if those options are needed.

Jody, you are correct as well: 1080p will have more resolution than 720p, and therefore the chance to show more detail.

But I would be careful about using 1080p and 1080i in the same sentence, as 1080i and 720p are very similar.

So yes, for shooting at 24 or 30 fps, 1080p would be a great choice, so the nod would go to the EX over the 720p cameras.

But only the video folks will notice the difference. Odds are "normal" viewers will not see much, if any, difference.

David Heath November 11th, 2007 06:31 PM

The EBU's stance is not that 720p is better than 1080i, period, but rather that it COMPRESSES far better, being progressive, and is therefore more suitable as a broadcast standard. That's a different matter from which may look the best straight out of the camera.

I'll accept they're right as far as it goes, but the problem with their stance is that it takes no account of 1080p/25 (broadcast psf). Strange, as so much material is made and transmitted that way.

Thomas Smet November 11th, 2007 06:59 PM

1 Attachment(s)
Quote:

Originally Posted by Jody Eldred (Post 773534)
Sure is a lot of math being talked about here. Strange, because not too many viewers watch TV with a calculator and test equipment in their hand...

;-)

Here's a simple test for you:

Shoot a scene in 720p and upconvert it to 1080i and 1080p.
Shoot the identical scene in 1080i and 1080p.
Compare the two.
Native 1080 will CLEARLY look superior to upconverted 720.

Downconvert the 1080 footage to 720.
Compare it to the natively shot 720.
It will match or look superior to the native 720 footage.

Sorry lads. The proof is in the pudding, not in the mathematics.

I'll put the XDCAM EX against any 720P camera out there.

OK, and sorry, I do not agree at all. A lot of people enjoy 720p broadcasts and have never once complained while watching shows in HD on ABC or FOX. I'm talking about film-source shows as well, such as Grey's Anatomy and other dramas.

Here is an image I made exactly the way you said I should test it. I took a Canon D20 still image and cropped it to 1920x1080 pixels. I then downscaled a 720p version and scaled it back up. I faked a 1080i version by adding a slight softness to stand in for the interlace-flicker reduction and low-pass filtering. Sure, maybe true 1080p watched as true 1080p is sharper, but it is not "clearly" better, and only tech heads watching it on a computer monitor would ever notice.

This image does not give any numbers at all because I do not use numbers. I use my eyes, and my eyes tell me that at the end of the day there is very little difference between 1080i and 720p. If you have a better example please show me, because no matter what type of image I start with, be it 3D rendered, a digital still or actual 1080i still shots from HDV, I always find the same to be true. It isn't exactly fair to compare 720p from DVCPROHD tape either, since it only uses 960x720 pixels. We are talking true 1280x720 pixels.
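
For anyone who wants to repeat that kind of comparison, here is a minimal sketch of the round trip using Python and Pillow. The source filename, crop position and blur radius are placeholder choices for illustration, not the actual files or settings used in the test above.

Code:

# Rough approximation of the comparison described above (Pillow).
# "source.jpg" and the blur radius are placeholders, not the actual
# files or settings used in the original test.
from PIL import Image, ImageFilter

src = Image.open("source.jpg").convert("RGB")      # assumes a >=1920x1080 still
full_1080 = src.crop((0, 0, 1920, 1080))           # 1080p reference crop

# Simulate 720p acquisition viewed on a 1080 display:
# downscale to 1280x720, then scale back up to 1920x1080.
via_720 = full_1080.resize((1280, 720), Image.LANCZOS).resize((1920, 1080), Image.LANCZOS)

# Crudely fake 1080i by adding a mild softening, standing in for the
# anti-flicker / low-pass filtering that interlaced material needs.
fake_1080i = full_1080.filter(ImageFilter.GaussianBlur(radius=0.7))

full_1080.save("1080p_reference.png")
via_720.save("720p_roundtrip.png")
fake_1080i.save("1080i_simulated.png")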

Evan Donn November 11th, 2007 08:56 PM

Quote:

Originally Posted by Werner Wesp (Post 768820)
Of course it doesn't. Normal, un-supersampled slow-mo (supersampling induces blur anyway) asks for deinterlacing first, leaving 720p50 at 50fps but making 1080i50 into 540p25. Slowed down, that gives respectively 25 and 12.5 fps.

If you want to throw out half your temporal resolution, go right ahead. But that's a choice you're making - and as such it's not a limitation of the format but of your workflow. It's relatively easy to turn 1080i50 into 540p50, so why wouldn't you?
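
As a rough sketch of what that field split looks like in code (array shapes are illustrative; this isn't any particular NLE's deinterlacer):

Code:

# Minimal sketch: split one 1080i frame into its two 540-line fields,
# which is the cheap way to get 50 half-height progressive images per
# second out of 1080i50. The zero-filled array is just a stand-in for
# a decoded frame.
import numpy as np

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)

top_field = frame[0::2, :, :]      # lines 0, 2, 4, ... -> 540 x 1920
bottom_field = frame[1::2, :, :]   # lines 1, 3, 5, ... -> 540 x 1920

# Played back in field order, each field is its own moment in time,
# so one 1080i50 frame yields two 540p images 1/50 s apart.
print(top_field.shape, bottom_field.shape)   # (540, 1920, 3) twice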

Greg Voevodsky November 11th, 2007 10:28 PM

No motion blur and oversampling... bad example
 
Quote:

Originally Posted by Thomas Smet (Post 773890)
OK, and sorry, I do not agree at all. A lot of people enjoy 720p broadcasts and have never once complained while watching shows in HD on ABC or FOX. I'm talking about film-source shows as well, such as Grey's Anatomy and other dramas.

Here is an image I made exactly the way you said I should test it. I took a Canon D20 still image and cropped it to 1920x1080 pixels. I then downscaled a 720p version and scaled it back up. I faked a 1080i version by adding a slight softness to stand in for the interlace-flicker reduction and low-pass filtering. Sure, maybe true 1080p watched as true 1080p is sharper, but it is not "clearly" better, and only tech heads watching it on a computer monitor would ever notice.

This image does not give any numbers at all because I do not use numbers. I use my eyes, and my eyes tell me that at the end of the day there is very little difference between 1080i and 720p. If you have a better example please show me, because no matter what type of image I start with, be it 3D rendered, a digital still or actual 1080i still shots from HDV, I always find the same to be true. It isn't exactly fair to compare 720p from DVCPROHD tape either, since it only uses 960x720 pixels. We are talking true 1280x720 pixels.

Excuse me if I'm missing something here, but are you not taking an oversampled image? So of course it looks better, just like HD downconverted to SD looks much better than SD to SD.

Second, a still photo with nothing moving is a bad example; you should have some motion blur and similar frame rates. 60fps looks very unnatural compared to 24fps. When I saw Showscan many years ago, it was overly sharp with no motion blur - the front rung on the wooden rollercoaster was sharp, which it is not in real life to our eyes or at 24fps.

Also, isn't a still image a bit of a joke?! Interlaced looks bad as a still frame compared to a progressive frame, but looks fine or better when seen in motion. Same with natural blur: it looks fine in motion and soft in a still. So I'd say to compare video properly, you need to take still images from the video in its native moving format... in motion, and more importantly, watch the moving footage side by side.

Adam Reuter November 11th, 2007 10:41 PM

Quote:

Originally Posted by Werner Wesp (Post 768334)
Sometimes I wonder...

How many people actually know that 720p50 is far superior to 1080i50? Or how many even know that 1080i is NOT full HD? And yet - it is just a matter of the easiest mathematics:

50 x 1280 x 720 pps or 50 x 1440 x 540 pps is kinda obvious. (and that's not even taking into account the interlaced filter you need to keep the image from being jittery)

1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?

Quote:

Originally Posted by Werner Wesp (Post 768334)
Not to mention all HD screens nowadays are progressive, so you cannot watch 1080i optimally.

New ones perhaps, but most of the HDTVs out there are the older 1080i variety. And what format are broadcasters delivering content in? 1080i! So footage shot in 720p gets converted to 1080i and DOES NOT look as good. I've done conversions from 1080i to 720p (60i and 60p) and most people do not notice a quality hit (it is minimal), but try the opposite (720p60 to 1080i60) and a lot of people notice a softening of the image.

I will give you that slo-mo looks better at 720p. So if slow motion is needed, shoot 720p for those parts, but shoot all the rest at 1080! That's what I do...

Also, 1080p televisions deinterlacing 1080i footage looks fantastic! If a TV can do the math on the fly, imagine what software algorithms can accomplish! I'd say shooting 1080i now is "future proofing" for whenever 1080p60 becomes a broadcast format.

I also agree that 1080i is less bit-efficient than 720p. But the difference is negligible.

As far as the camera itself goes, 1/2" chips tend to give better color reproduction, latitude and of course nicer depth-of-field qualities. All of this translates to a more professional image compared to the 1/3" chip cameras. I think all future prosumer cameras will HAVE to be 1/2" chip cameras in order to combat the low-light problem. It really is a physics problem at this point, and Sony is pushing the competition into this arena... which we will all benefit from!

Thomas Smet November 12th, 2007 11:44 AM

Quote:

Originally Posted by Greg Voevodsky (Post 773946)
Excuse me if I'm missing something here, but are you not taking an oversampled image? So of course it looks better, just like HD downconverted to SD looks much better than SD to SD.

Second, a still photo with nothing moving is a bad example; you should have some motion blur and similar frame rates. 60fps looks very unnatural compared to 24fps. When I saw Showscan many years ago, it was overly sharp with no motion blur - the front rung on the wooden rollercoaster was sharp, which it is not in real life to our eyes or at 24fps.

Also, isn't a still image a bit of a joke?! Interlaced looks bad as a still frame compared to a progressive frame, but looks fine or better when seen in motion. Same with natural blur: it looks fine in motion and soft in a still. So I'd say to compare video properly, you need to take still images from the video in its native moving format... in motion, and more importantly, watch the moving footage side by side.

It is only a bad example if you are talking about motion. This test isn't talking about motion but detail. At the end of the day there is very little change in detail between 1080i and 720p. If you want to talk motion, then 60p is also better. Sure, as a still it is hard to notice the motion of interlaced, but when you watch 1080i on a digital display it is playing one field then another and so on. 60p will be much cleaner, since there will not be any interlace artifacts or flickering edges from the details alternating back and forth between fields. Sure, you can add more of a low-pass filter to blur this, but then you really are left with something just as soft as 720p.

I can make a video test as well, but the detail is still going to look the same. All it is going to show is the difference between 540-line fields running 60 times a second and 720-line frames running 60 times a second. They both, for the most part, look the same and have the exact same type of motion and motion blur. One just has alternating fields that could flicker.

Again, this is not a 720p vs 1080i debate, but a demonstration of how the 720p mode from a JVC camera or the EX1 can give very good results. In fact I would even go as far as to say that 1080i and 720p pretty much give you the same thing in the end. One is cleaner, while the other may have a few slivers of detail that sneak through every now and then. There is no "clearly" better format though.

Even when you can have 100% true 1080p, there just isn't that huge a difference. Sure, it is a little more crisp, but is it really worth that much extra bandwidth? 1080p at 24p vs 720p at 24p uses a lot of bandwidth for that tiny edge in sharpness. People get so hung up on detail and sharpness when that is only a tiny part of image quality.

Steven Thomas November 12th, 2007 12:01 PM

Quote:

Originally Posted by Thomas Smet (Post 774228)
Even when you can have 100% true 1080p, there just isn't that huge a difference. Sure, it is a little more crisp, but is it really worth that much extra bandwidth?

Well, we'll know soon enough. Fortunately, the EX1's lens should have the resolving ability to answer that question. At least for myself, since the video cameras I've owned did not resolve much more than 700 lines, which is still decent.

I have a feeling 1080 24p and 30p are going to be a thing of beauty coming off these cameras. I'm really interested in 30p for upcoming event work.
I'll probably throw in some occasional 720 60p uprezzed to 1080 30p for overcranked slow-mo.

Thomas Smet November 12th, 2007 12:04 PM

Quote:

Originally Posted by Adam Reuter (Post 773952)
1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?



New ones perhaps, but most of the HDTVs out there are the older 1080i variety. And what format are broadcasters delivering content in? 1080i! So footage shot in 720p gets converted to 1080i and DOES NOT look as good. I've done conversions from 1080i to 720p (60i and 60p) and most people do not notice a quality hit (it is minimal), but try the opposite (720p60 to 1080i60) and a lot of people notice a softening of the image.

I will give you that slo-mo looks better at 720p. So if slow motion is needed, shoot 720p for those parts, but shoot all the rest at 1080! That's what I do...

Also, 1080p televisions deinterlacing 1080i footage looks fantastic! If a TV can do the math on the fly, imagine what software algorithms can accomplish! I'd say shooting 1080i now is "future proofing" for whenever 1080p60 becomes a broadcast format.

I also agree that 1080i is less bit-efficient than 720p. But the difference is negligible.

As far as the camera itself goes, 1/2" chips tend to give better color reproduction, latitude and of course nicer depth-of-field qualities. All of this translates to a more professional image compared to the 1/3" chip cameras. I think all future prosumer cameras will HAVE to be 1/2" chip cameras in order to combat the low-light problem. It really is a physics problem at this point, and Sony is pushing the competition into this arena... which we will all benefit from!

Wrong. Not all HD is broadcast as 1080i. That is a myth. ABC, FOX and ESPN are 720p.

Of course 1080i creates an optical illusion of solid frames, but that doesn't change the fact that it is still 540-line fields alternating. Sure, this makes it look like 60p, but it can have detail flickering if those details are too small. In order to make sure you do not have details that are one pixel thick, you have to use low-pass filtering, which in the end brings the sharpness down. The image I posted shows that. The fact that it is interlaced doesn't reduce the level of detail at all; it is the low-pass filtering that reduces the detail of 1080i, and the jagged edges, that are the problem. If you do not agree with what I have done, please do your own test and show me I am wrong. I have yet to see any examples of how 1080i is that much better than 720p.
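
To illustrate the sort of vertical low-pass (anti-flicker) filtering being described, here is a small sketch. The [0.25, 0.5, 0.25] kernel is a generic textbook choice, not any particular camera's filter.

Code:

# Vertical low-pass (anti-flicker) sketch: a simple [0.25, 0.5, 0.25]
# kernel run down the columns, which suppresses detail only one line
# high - exactly the detail 1080i has to give up to avoid flicker.
import numpy as np

def vertical_lowpass(img):
    """img: (height, width) luma array; returns the softened image."""
    padded = np.pad(img.astype(np.float32), ((1, 1), (0, 0)), mode="edge")
    return 0.25 * padded[:-2] + 0.5 * padded[1:-1] + 0.25 * padded[2:]

# One-line-high detail (alternating black/white rows) flattens toward gray.
stripes = np.tile(np.array([[0.0], [255.0]]), (540, 1920))   # 1080x1920
print(stripes[:4, 0])                      # [  0. 255.   0. 255.]
print(vertical_lowpass(stripes)[:4, 0])    # roughly [ 64. 128. 128. 128.]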

Evan Donn November 12th, 2007 07:16 PM

Quote:

Originally Posted by Adam Reuter (Post 773952)
1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?

Actually the passage you quoted was not equating 1080i to 540 lines of resolution - you missed the third number, which indicates the samples per second. So 50 x 1440 x 540 translates to 50 samples (fields) per second, each 540 lines high by 1440 pixels wide. A frame, composed of two fields, will have more than 540 lines of resolution, but less than 1080 due to the low-pass filtering, as Thomas mentioned. However, each individual field cannot have more than 540, because that's how many lines there are.

Adam Reuter November 12th, 2007 08:40 PM

Quote:

Originally Posted by Evan Donn (Post 774455)
Actually the passage you quoted was not equating 1080i to 540 lines of resolution - you missed the third number, which indicates the samples per second. So 50 x 1440 x 540 translates to 50 samples (fields) per second, each 540 lines high by 1440 pixels wide. A frame, composed of two fields, will have more than 540 lines of resolution, but less than 1080 due to the low-pass filtering, as Thomas mentioned. However, each individual field cannot have more than 540, because that's how many lines there are.

Again, on paper 720p has the higher numbers, but can humans really discern 60 frames per second? I won't dispute that for sports and high-motion work it is best. But for those landscape/vista shots, that's where the extra detail counts. And that's why Discovery HD Theatre broadcasts in 1080i.

When the two fields are blended together we get 1080 lines. On a good enough TV (a Pioneer Elite series or something) 1080 lines definitely look better than 720. Sitting closer to the TV allows you to see an even greater difference. Even on the 55" projection HDTV that I have at home, FOX and ABC look softer than NBC and CBS broadcasts. This was BEFORE I found out how they broadcast. Originally I thought Comcast was giving those channels less bandwidth, but now I know why the picture looks softer. I probably wouldn't notice the difference on a 40" or smaller HDTV though.

Like Jody Eldred said, try converting 1080i to 720 and then try the reverse. 1080 yields better resolution. And since there are reports that this camera resolves 1,000 lines on the charts, users should definitely benefit from shooting in the 1080i format.

But really, perhaps all this debate is moot when we look at surveys. Contrast ratio is number one for how picture quality is perceived. The 1/2" chip offering should help that a lot. And number two is color saturation/accuracy. Hopefully Sony gets these right!

Kevin Shaw November 12th, 2007 11:04 PM

Quote:

Originally Posted by Thomas Smet (Post 774228)
Even when you can have 100% true 1080p, there just isn't that huge a difference.

Sure there is: full 1080p has 2.25 times the resolution of 720p, which in turn has 2.67 times the resolution of SD video. If you can see the difference between 720p and SD, you'll be able to see the difference between 1080p and 720p on a 1080p display.

And enough of the simulated photo-based examples: either compare footage from actual cameras or wait until someone else gets a chance to do so. Thanks.

John Bosco Jr. November 13th, 2007 05:20 AM

Quote:

Originally Posted by Werner Wesp (Post 768334)
Sometimes I wonder...

How many people actually know that 720p50 is far superior to 1080i50? Or how many even know that 1080i is NOT full HD? And yet - it is just a matter of the easiest mathematics:

Totally wrong. I'm in NTSC land, but the concept is still the same. You cannot measure interlace as progressive because it's not progressive. It splits each frame into fields to achieve the 1080 lines of resolution. The shortcoming of interlace lies in fast motion; that's where progressive shines. But, basically, 1080 is bigger than 720. You can't argue the numbers; however, I wouldn't classify 1080i as far superior, but it is superior.

Oh, by the way, just because a particular codec processes the pixels differently doesn't mean it's not full HD. 1440 x 1080i will display on a monitor as 1920 x 1080i.

1080i is the best we have until the broadcast bandwidth problems are solved, and 1080p becomes the high def standard.

Alex Leith November 13th, 2007 05:43 AM

But 1080i isn't superior... Interlace is a dinosaur from the 1930s when tube technology couldn't scan full-resolution images fast enough to give the perception of a moving image. All new LCD technology is progressive and has to de-interlace interlaced signals to display them... and how well that is done really depends on the set.

Mathematically 1920 x 1080i has the edge over 720p, but 720p has the edge over 1440 x 1080i.

1920 x 540 x 60i = 62.2 million pixels per second
1280 x 720 x 60p = 55.3 million pixels per second
1440 x 540 x 60i = 46.7 million pixels per second
720 x 240 x 60i = 10.4 million pixels per second
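
A quick way to reproduce those figures (plain arithmetic, nothing camera-specific):

Code:

# Reproduce the pixel-per-second figures above: width x lines per
# sample x samples per second, where a "sample" is one field for
# interlaced formats and a full frame for progressive.
formats = [
    ("1920x1080i60", 1920, 540, 60),
    ("1280x720p60", 1280, 720, 60),
    ("1440x1080i60", 1440, 540, 60),
    ("720x480i60 (SD)", 720, 240, 60),
]
for name, width, lines, rate in formats:
    print(f"{name}: {width * lines * rate / 1e6:.1f} million pixels per second")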

However, what does it look like? Personally I think all progressive images look preferable to similar spec interlaced images. Our eyes do not see interlace, and surely the ultimate goal is to match the image to our eyes? Although I find 720p60 a touch unnerving because the motion seems a little too fluid... It takes away the dreamy 25p haze!

It's interesting how it's partly cultural. More North Americans seem attached to interlace than Europeans do. I suspect it has to do with the fact that we Europeans have been seeing 25p (or psf) every time we watch a film on TV for the past 60 years, whereas there hasn't been any historical broadcasting of 30p (or real 24p) in North America, given that films for 60Hz regions get that awful cack-handed 2:3 pulldown. (See, it's cultural.)

John Bosco Jr. November 13th, 2007 05:47 AM

What some of you don't understand is that the networks' decision on which high-definition standard to go with was not too much different from some of our arguments. For example, the U.S. networks CBS and NBC overwhelmingly decided to go with 1080i simply because the top executives could not accept 720p as high definition. ABC and Fox felt that progressive would be better for the majority of their shows, which are sports. Although they were hoping for 1080p, they felt that the faster motion of sports would look better using 720p.

David Heath November 13th, 2007 06:06 AM

Quote:

Originally Posted by John Bosco Jr. (Post 774631)
....... the networks' decision on which high-definition standard to go with was not too much different from some of our arguments.

Timing is also important, with HD broadcasting becoming a reality far earlier in the States than most of the rest of the world (except Japan).

Screens with 1920 horizontal resolution have only become a reality at sensible prices in the last year or so, and the same with cameras - for 60Hz standards, HDCAM is 1440 horizontal, DVCProHD (1080) is 1280. Only now is 1080 starting to mean 1920, if you like. So the networks' argument may well have gone along the lines of "why go with a format whose undeniable advantage (horizontal resolution) can't really be taken full advantage of in practice?"

Fair point - but things have moved on. HD blue-laser discs are full 1080p, and many screens sold now are 1920x1080 - likely to approach 100% before very long. Differences that once didn't matter now increasingly do, and the danger for ABC and Fox is that more and more people WILL notice the differences as screen and camera technology progresses.

But hopefully the current 1080p/25 and 1080i/25 mix will go to 1080p/25(50), and then everybody will be happy.

David Parks November 13th, 2007 07:31 AM

Quote: From ABC executive on decision to go with 720p...

"They usually fail to mention that during the time that 1080i has constructed a single frame of two million pixels, [1/25] second, 720p has constructed two complete frames, which is also about two million pixels. In a given one-second interval, both 1080i and 720p scan out about 60 million pixels. The truth is that, by design, the data rates of the two scanning formats are approximately equal, and 1080i has no genuine advantage in the pixel rate department. In fact, if the horizontal pixel count of 1080i is reduced to 1440, as is done in some encoders to reduce the volume of artifacts generated when compressing 1080i, the 1080i pixel count per second is less than that of 720p".

This 720p vs. 1080i argument has been going on a while now. The truth is that in the late '90s different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp.

The amazing thing is now the EX does both. Life is good.

Tim Polster November 13th, 2007 08:09 AM

Quote:

Originally Posted by David Parks (Post 774647)
This 720p vs. 1080i argument has been going on a while now. The truth is that in the late '90s different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp.

Bingo.

When they signed those contracts nobody had HD televisions in their houses and the equipment to produce in HD was such an investment that the only way to convince networks to use it was to have the manufacturers foot most of the bill.

Hence Sony or Panasonic got to choose the format because they were paying for it.

I really don't think pointy-headed pixel peeping (that we like to talk about) had much bearing on the choice for the execs. IMHO

Joel Chappell November 13th, 2007 08:19 AM

With the EX1 capable of 720p, 1080p and other modes, does that give it an edge over the 250 in this one area? Or do the JVC claim that 720p is here for the foreseeable future, and the ability to easily "uprender" 720p to 1080i, have enough merit to even it up a bit?

David Heath November 13th, 2007 09:41 AM

If you want 24/25fps motion ("film-look") then the EX should have a big edge theoretically because of the 1080p/24(25). It's when you want 50/60Hz motion that the argument gets more complicated. I don't think anyone will argue 720p/25 is superior to 1080p/25, if you're starting with 1920x1080 chips.

Thomas Smet November 13th, 2007 12:04 PM

Quote:

Originally Posted by Kevin Shaw (Post 774541)
Sure there is: full 1080p has 2.25 times the resolution of 720p, which in turn has 2.67 times the resolution of SD video. If you can see the difference between 720p and SD, you'll be able to see the difference between 1080p and 720p on a 1080p display.

And enough of the simulated photo-based examples: either compare footage from actual cameras or wait until someone else gets a chance to do so. Thanks.

At least I try to be honest and back up what I say, instead of just saying it with words and convincing everybody I am right just because I say so.

Kevin, you are thinking in numbers and not in how it looks. Resolution is not everything, and the bigger an image gets, the harder it is to notice the extra detail. You cannot just say 2.25 times larger and leave it at that, because that tells nobody how it will actually look.

Take a 1080i sample and downconvert it to 720p. Sure, you may notice a slight detail loss, but not 2.25 times' worth. If you do not trust my sample, which comes from a higher-quality source than real 1080i would have given me, then do it yourself. Sure, this is a photo, but this test actually gives more of an edge to 1080, since the source is such high quality.

Thomas Smet November 13th, 2007 12:13 PM

Quote:

Originally Posted by David Heath (Post 774706)
If you want 24/25fps motion ("film-look") then the EX should have a big edge theoretically because of the 1080p/24(25). It's when you want 50/60Hz motion that the argument gets more complicated. I don't think anyone will argue 720p/25 is superior to 1080p/25, if you're starting with 1920x1080 chips.

Well, that depends. My sample shows that while 1080p may have an edge in detail, it isn't a huge edge like some would like to think.

There is also the fact that if you are shooting 24p/25p you will get much better compression compared to the 1080p flavor. 720p at 24p uses 2.5 times fewer frames than 60p, so the 35 Mbit/s will result in much higher-quality compression. The 1080p sample would need a bitrate higher than 60 Mbit/s to get the same compression quality as the 720p sample. So sure, maybe the 720p will have less detail, but it will have better compression. If you are shooting an action movie I would rather use 720p, because it should be very hard to break the codec. 720p also has the option of shooting slow-motion segments, which is something you can't really do with 1080p. I would much rather have all my film's shots at the same level of detail and stick with 720p for a consistent look than alternate between the two resolutions.

720p at 24p/25p gives an artist a lot more creative options and lighter compression, at only a slight loss in detail.
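
As a back-of-the-envelope check on that argument, here is the bits-per-pixel arithmetic. The 35 Mbit/s figure comes from the discussion; the rest is plain division and ignores GOP structure, chroma subsampling and everything else a real codec does.

Code:

# Back-of-envelope bits-per-pixel comparison behind the argument above.
# 35 Mbit/s is the rate discussed in the thread; everything else is
# plain division, ignoring GOP structure and codec details.
BITRATE = 35_000_000  # bits per second

def bits_per_pixel(width, height, fps, bitrate=BITRATE):
    return bitrate / (width * height * fps)

print(f"720p60  at 35 Mbit/s: {bits_per_pixel(1280, 720, 60):.2f} bits/pixel")
print(f"720p24  at 35 Mbit/s: {bits_per_pixel(1280, 720, 24):.2f} bits/pixel")
print(f"1080p24 at 35 Mbit/s: {bits_per_pixel(1920, 1080, 24):.2f} bits/pixel")

# Bitrate 1080p24 would need to match 720p24's bits/pixel at 35 Mbit/s:
needed = BITRATE * (1920 * 1080) / (1280 * 720)
print(f"1080p24 needs roughly {needed / 1e6:.0f} Mbit/s to match")  # ~79 Mbit/s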

Chris Hurd November 13th, 2007 12:15 PM

Quote:

Originally Posted by Thomas Smet (Post 774770)
...you are thinking in numbers and not in how it looks. Resolution is not everything, and the bigger an image gets, the harder it is to notice the extra detail. You cannot just say 2.25 times larger and leave it at that, because that tells nobody how it will actually look.

This is indeed the truth, and it is a fundamental concept that it is the mission of this site to impress upon all who visit here. I go out of my way to promote this single fundamental understanding to everyone who wanders through this site.

It's never just about the numbers - it's only partially about the numbers, and numbers always take a back seat to how the image actually looks. Plus there are other factors that need to be considered, such as viewing distance, for example... there is a point where the viewer's distance from the screen renders moot any difference in "resolution." Thanks, Thomas,

John Bosco Jr. November 14th, 2007 05:30 AM

Quote:

Originally Posted by David Heath (Post 774634)
Timing is also important, with HD broadcasting becoming a reality far earlier in the States than most of the rest of the world (except Japan).

Screens with 1920 horizontal resolution have only become a reality at sensible prices in the last year or so, and the same with cameras - for 60Hz standards, HDCAM is 1440 horizontal, DVCProHD (1080) is 1280. Only now is 1080 starting to mean 1920, if you like. So the networks' argument may well have gone along the lines of "why go with a format whose undeniable advantage (horizontal resolution) can't really be taken full advantage of in practice?"

True. However, broadcast stations knew what they were getting into with HD as Sony and Panasonic demonstrated the two options, 1080i and 720p, on full HD monitors, 1920 x 1080i and 1280 x 720p. In fact, the monitors were CRTs. Panasonic later demonstrated 720p on huge motion-picture-like video screens and Sony on plasma. However, what was not known at that time was whether 720p60 or 720p30 would be the broadcast standard. I still don't think that would have changed the networks' view on which standard to go with.

John Bosco Jr. November 14th, 2007 05:41 AM

Quote:

Originally Posted by Thomas Smet (Post 774776)
Well, that depends. My sample shows that while 1080p may have an edge in detail, it isn't a huge edge like some would like to think.

There is also the fact that if you are shooting 24p/25p you will get much better compression compared to the 1080p flavor. 720p at 24p uses 2.5 times fewer frames than 60p, so the 35 Mbit/s will result in much higher-quality compression. The 1080p sample would need a bitrate higher than 60 Mbit/s to get the same compression quality as the 720p sample. So sure, maybe the 720p will have less detail, but it will have better compression. If you are shooting an action movie I would rather use 720p, because it should be very hard to break the codec. 720p also has the option of shooting slow-motion segments, which is something you can't really do with 1080p. I would much rather have all my film's shots at the same level of detail and stick with 720p for a consistent look than alternate between the two resolutions.

720p at 24p/25p gives an artist a lot more creative options and lighter compression, at only a slight loss in detail.

That is the best argument yet for 720p, and considering that U.S. broadcast TV compresses to 19 Mbit/s using long-GOP MPEG-2, compression artifacts will have some impact. Good point.

Of course, I don't know how we got started on 1080i vs 720p, as the EX1 shoots both.
I believe the EX1 is a better camera than the JVC 250, mainly because of the sensor size and higher bitrate. I feel the 250 is overpriced for what it is.

David Heath November 14th, 2007 05:51 AM

Quote:

Originally Posted by John Bosco Jr. (Post 775175)
True. However, broadcast stations knew what they were getting into with HD as Sony and Panasonic demonstrated the two options, 1080i and 720p, on full HD monitors, 1920 x 1080i and 1280 x 720p.

Maybe, but at the time the decisions were being made, full HD monitors weren't even available to consumers, and that possibility seemed in the very far future. Now they are close to being the norm. Hence, even if a difference was seen, it would be quite understandable for it to be ignored on the basis that it couldn't be noticeable to any consumer - which until recently has been true.

But I agree with previous posts that too much shouldn't be read into some network decisions; they are not always made for the best technical reasons.

Kevin Shaw November 14th, 2007 08:14 AM

Hey gang, the EX1 will be shipping in a few days and we can start looking at some actual footage to compare, instead of speculating about it based on technical specs. :-)

Does Blu-ray support 720p playback at 60 fps?

Tom Vaughan November 14th, 2007 07:58 PM

Quote:

Originally Posted by Kevin Shaw (Post 775232)

Does Blu-ray support 720p playback at 60 fps?

Yes, it does... encoded in MPEG-2, VC-1 or AVC (H.264).

Tom

Werner Wesp November 28th, 2007 11:16 AM

Uprezzing 1080i50 to 1080p50 for a scene with a lot of movement requires deinterlacing first, therefore yielding lower-quality 1080p50 compared to uprezzed 720p50. If the original 1080i footage isn't deinterlaced first, you get jagged edges on movement.

Anyhow, I think a lot of people get me wrong: interlacing is a very nifty trick. It solved the original problem of tubes years and years ago, for instance. It is just an old-fashioned standard nowadays with virtually no benefits, since all flat-panel TVs and monitors are natively progressive. They do accept 1080i, but have to deinterlace it to show it on their screens. Interlacing is only seen at full quality on CRTs, because that's the screen technology it is native to.

Since virtually all equipment is progressive, codecs are better adapted to progressive, and so on, why retain the old standard that was only invented because TVs couldn't handle full images at once (i.e. progressive)? How many people with a 1080i camcorder actually have a 1080i-capable CRT monitor, I wonder...

The mathematics are simple enough, I don't have to repeat them. 720p50 packs over 46.0 million pixels per second - 1080i50 packs theoretically just under 38.9 million pixels per second*. I know what footage I'd like to have on my NLE, especially if I want to do a lot of time-stretching effects.

* In reality it is even less, because if 1080i resolved all 1080 lines, horizontal details one pixel thick would cause severe interlace flicker. To avoid that and keep the image steady, the interlace filter smears details out over several lines - resulting in a small Gaussian-filter effect - working out to a resolution loss of about 30%. More realistically, 1080i50 packs roughly 28 million pixels per second.

All this combined makes it difficult to believe 1080i is still here. Then again, why would Sony offer only 1080i on older models but add 720p capability on recent ones?

720p and 1080i are both standards that'll be gone soon enough to make way for a 1080p50/60 standard, but in the meantime, I'd suggest shooting in 720p50/60; and if you really want 1080i50/60, you can always DOWNsample your original 720p50/60 to 1080i50/60...
(720p50 to 1080i50 is indeed downsampling from 46M to roughly 28M pixels per second)
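
For what it's worth, the same arithmetic written out; the ~30% filtering loss is Werner's own estimate, applied here as a flat multiplier.

Code:

# Raw pixel throughput per second, plus the 1080i figure derated by the
# estimated ~30% loss from anti-flicker low-pass filtering. A flat 30%
# gives about 27 M; the post rounds this to roughly 28 M.
p720_rate = 1280 * 720 * 50        # 720p50: 46.08 million pixels/s
i1080_rate = 1440 * 540 * 50       # 1080i50 (1440 wide): 38.88 million pixels/s
i1080_filtered = i1080_rate * (1 - 0.30)

print(f"720p50:             {p720_rate / 1e6:.1f} M pixels/s")
print(f"1080i50 (raw):      {i1080_rate / 1e6:.1f} M pixels/s")
print(f"1080i50 (filtered): {i1080_filtered / 1e6:.1f} M pixels/s")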

David Heath November 28th, 2007 11:25 AM

All absolutely true if you need 50 or 60 Hz motion. But for most drama etc, 25Hz motion is chosen for aesthetic reasons, and then we're not talking about 1080i/25, but rather 1080p/25. But yes, the sooner 1080p/50 sweeps both 1080i AND 720p away, the better.

Werner Wesp November 28th, 2007 11:29 AM

Quote:

Originally Posted by David Heath (Post 783527)
All absolutely true if you need 50 or 60 Hz motion. But for most drama etc, 25Hz motion is chosen for aesthetic reasons, and then we're not talking about 1080i/25, but rather 1080p/25. But yes, the sooner 1080p/50 sweeps both 1080i AND 720p away, the better.

True. Since 1080p at 24, 25 and 30 is possible, that is superior to 720p at 24, 25 or 30. What the camera will deliver in those modes remains to be seen, but 1080p is the best you'll get if the camcorder can deliver it and the frame rate you need is available in that mode.

Kevin Shaw November 28th, 2007 03:04 PM

My understanding is that the EX1 has a 1080p recording mode, so if it maintains full resolution at that setting it should look pretty good compared to the JVC cameras. And there are other reasons to expect the EX1 to be stiff competition, so it's not just about pixel counts and frame rates.

Thomas Smet November 29th, 2007 12:06 AM

I have done a few tests now with motion video. I have created many samples of animated video, including simple fast-moving objects and scenes that have a lot of 3D-rendered hair and fur. In the end, with proper filtering, the 1080i versions and the 720p versions both look almost exactly the same played back upscaled to 1080p at 60p. I would share these samples, but I cannot think of a way to share 1920x1080 60p video with you guys. All the tests I have done were in special playback software I wrote so it could run the video at 60p.

I would say the 1080i was about 1% sharper, which was kind of an illusion really. The reason it looks sharper is the interlacing. Because the video is interlaced you end up with 1080 unique lines of detail that are all different. With progressive video the lines tend to blend into each other, making a more natural-looking image. With interlaced, however, you do not see this blend as much because of how the fields are split. 1080i does work very well for HD, and while there can be edge artifacts, they do not show up as much as some would think. 1/60th of a second is pretty fast, and before you can notice anything missing, the missing part is replaced by the next field, and so on. The beauty of 1080i, however, is that even a single field still looks like a 1080 image. A single field with the lines alternating will look much sharper than a 1920x540 rendered image, so you cannot say 1080i only has 1920x540 per field. Sure, it has that many pixels, but they are spread out better so they look more detailed.

Steve Mullen November 29th, 2007 12:09 AM

Quote:

Originally Posted by Kevin Shaw (Post 783686)
My understanding is that the EX1 has a 1080p recording mode, so if it maintains full resolution at that setting it should look pretty good compared to the JVC cameras. And there are other reasons to expect the EX1 to be stiff competition, so it's not just about pixel counts and frame rates.

It seems lots of folks are throwing around the term "1080p" without realizing that the only two usable frame rates are 25p -- ONLY in R50 -- and 24p -- ONLY in R60. And either frame rate is only usable where low frame rates are desired.

When smooth and clear motion capture is needed, 1080p is not available. You'll have to shoot 720p50 or 720p60.

So there remains a conflict between fine detail (1080) and clear motion (progressive). The only thing that has changed is that before we had to select Sony or JVC -- while now we can select between formats on the EX1.

Whoever can get 1080p50 and 1080p60 in production -- gets the prize. (I suspect JVC will announce at NAB 2008 as they are committed to progressive.)

What's odd is that the EX1's CMOS chips and DSP are already working at 50Hz/60Hz. It simply isn't recording it. Which is strange because solid-state recording shouldn't be a limiting factor.

Kevin Shaw November 29th, 2007 01:22 AM

Quote:

Originally Posted by Steve Mullen (Post 783941)
When smooth and clear motion capture is needed, 1080p is not available. You'll have to shoot 720p50 or 720p60.

So the EX1 has a 720p60 recording mode like the JVC cameras and a full-raster 1080p24 mode, both using higher bandwidth? Sounds like healthy competition to me...

Werner Wesp November 29th, 2007 05:54 AM

Quote:

Originally Posted by Thomas Smet (Post 783939)
I have done a few tests now with motion video. I have created many samples of animated video, including simple fast-moving objects and scenes that have a lot of 3D-rendered hair and fur. In the end, with proper filtering, the 1080i versions and the 720p versions both look almost exactly the same played back upscaled to 1080p at 60p. I would share these samples, but I cannot think of a way to share 1920x1080 60p video with you guys. All the tests I have done were in special playback software I wrote so it could run the video at 60p.

I would say the 1080i was about 1% sharper, which was kind of an illusion really. The reason it looks sharper is the interlacing. Because the video is interlaced you end up with 1080 unique lines of detail that are all different. With progressive video the lines tend to blend into each other, making a more natural-looking image. With interlaced, however, you do not see this blend as much because of how the fields are split. 1080i does work very well for HD, and while there can be edge artifacts, they do not show up as much as some would think. 1/60th of a second is pretty fast, and before you can notice anything missing, the missing part is replaced by the next field, and so on. The beauty of 1080i, however, is that even a single field still looks like a 1080 image. A single field with the lines alternating will look much sharper than a 1920x540 rendered image, so you cannot say 1080i only has 1920x540 per field. Sure, it has that many pixels, but they are spread out better so they look more detailed.


This is why you can't use animated graphics as a base. When you resampled that to 1080i, probably no anti-flicker filtering was added. You haven't checked your 1080i on a 1080i CRT, I suppose.

Furthermore, most motion graphics aren't subjected to the compression penalty that makes interlaced material less suited to MPEG, since there's no higher-order correction for redundant info between fields.

1080i has just 1920x540 per field (or 1440x540). On a CRT without a deinterlacing filter it seems to have 1920/1440x1080 per frame, but it is still 1920/1440x540 per field.

If you really want to test: stitch a few videos in 720p50 together to get 1080p50. Then downsample to 720p50 and to 1080i50. The difference is there. I'd say it is quite something; others don't mind or don't see it - perception is subjective, and it needs to be said that both 720p and 1080i look rather good (especially compared to SD, even PAL).
When you slow down the 1080i and the 720p it is a whole different game: no one remains who doesn't see the difference.

Thomas Smet November 29th, 2007 04:01 PM

Quote:

Originally Posted by Werner Wesp (Post 784016)
This is why you can't use animated graphics as a base. When you resampled that to 1080i, probably no anti-flicker filtering was added. You haven't checked your 1080i on a 1080i CRT, I suppose.

Furthermore, most motion graphics aren't subjected to the compression penalty that makes interlaced material less suited to MPEG, since there's no higher-order correction for redundant info between fields.

1080i has just 1920x540 per field (or 1440x540). On a CRT without a deinterlacing filter it seems to have 1920/1440x1080 per frame, but it is still 1920/1440x540 per field.

If you really want to test: stitch a few videos in 720p50 together to get 1080p50. Then downsample to 720p50 and to 1080i50. The difference is there. I'd say it is quite something; others don't mind or don't see it - perception is subjective, and it needs to be said that both 720p and 1080i look rather good (especially compared to SD, even PAL).
When you slow down the 1080i and the 720p it is a whole different game: no one remains who doesn't see the difference.

I have tested all of this. My graphics all had filtering at three strengths, from light to heavy. I even tried different ways of filtering the video, such as blur filters, and even downscaling slightly and scaling back up to simulate lower detail. None of my tests had sharp, crisp graphics, because I always design my graphics with some level of filtering to reduce DCT-based image artifacts when compressed. The graphics from 3D Studio Max used the video antialiasing method, which simulates video filtering and is actually very soft in detail. In fact some people may even say I filtered a little too much, which is why I also tried this a few times with less filtering.

What I have found is that even if you start with totally unfiltered video, the 1080i version still isn't really that much more detailed, and they both end up pretty much looking the same in terms of detail - although unfiltered, the 1080i looked like garbage because it flickered all over the place, while the 720p still looked very clean.

I have even stitched together uncompressed live SD captures to create fake HD, but this can only go so far because SD cameras tend to filter more than what you need for HD. This is also an almost useless test for motion, because you have no way to capture 1080i 60i and 720p 60p at the same time to test the motion. I mostly use this method to test encoders and processing software.

As much as I hate 1080i, it really isn't fair to call it 1440x540. If you create an image that is 1440x540 pixels in size, and then one with the same 1440x540 lines spread out over 1080 with space alternated in between, the second one will look sharper. 1080 lines with every other line duplicated will look sharper than 540 pixels blown up to 1080. It is all about the illusion of detail, and that is exactly what interlace does. It doesn't really work to think of interlaced in terms of pixel size, because there are so many optical-illusion factors in play that it just doesn't work.

Steve Mullen November 29th, 2007 05:31 PM

Quote:

Originally Posted by Kevin Shaw (Post 783966)
So the EX1 has a 720p60 recording mode like the JVC cameras and a full-raster 1080p24 mode, both using higher bandwidth?

And, a 720p24 mode.

That's why the ball is now in JVC's court.

They could do a handheld version of the HD250 and/or push the art with 1080p50 or 1080p60. Europe needs 1080p50 immediately!

Or, give us ProHD 422.

Or, all of the above.

A hard disk will take the higher data rates needed for these.



DV Info Net -- Real Names, Real People, Real Info!
1998-2025 The Digital Video Information Network