XDcamEX vs JVC GY HD250 - Page 4 at DVinfo.net
Old November 11th, 2007, 08:56 PM   #46
Major Player
 
Join Date: Feb 2006
Location: Philadelphia
Posts: 795
Quote:
Originally Posted by Werner Wesp View Post
Of course it doesn't. Normal, un-supersampled slow mo (supersampling induces blur anyway) asks for deinterlacing first. Leaving 720p50 at 50fps, but making 1080i50 into 540p25. Slowed down that gives respectively 25 and 12,5 fps.
If you want to throw out half your temporal resolution, go right ahead. But that's a choice you're making - and as such it's not a limitation of the format but of your workflow. It's relatively easy to turn 1080i50 into 540p50, so why wouldn't you?
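Evan's point can be sketched in code. Below is a minimal "bob" deinterlacing illustration (a hypothetical NumPy sketch; the frame size and top-field-first order are assumptions): each field becomes its own 540-line progressive frame, so 1080i50 yields 50 images per second rather than 25.

```python
# Sketch of "bob" deinterlacing: split each interlaced frame into its
# two fields and keep both as separate (half-height) progressive frames.
# Assumes top-field-first; a real deinterlacer would also interpolate
# each field back up to full height.
import numpy as np

def bob_deinterlace(frame):
    """Return (top_field, bottom_field) from an interlaced H x W frame."""
    top = frame[0::2, :]     # even lines: the field sampled first
    bottom = frame[1::2, :]  # odd lines: the field sampled 1/50 s later
    return top, bottom

frame = np.arange(1080 * 1440).reshape(1080, 1440)
top, bottom = bob_deinterlace(frame)
print(top.shape, bottom.shape)  # (540, 1440) (540, 1440)
```

One 1080i50 stream becomes 540p50: half the spatial lines, but all 50 temporal samples survive, which is exactly what slow motion needs.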
__________________
My latest short documentary: "Four Pauls: Bring the Hat Back!"
Evan Donn is offline
Old November 11th, 2007, 10:28 PM   #47
Regular Crew
 
Join Date: Sep 2006
Location: Portola Valley
Posts: 105
No motion blur and oversampling... bad example

Quote:
Originally Posted by Thomas Smet View Post
Ok, and sorry I do not agree at all. A lot of people enjoy 720p broadcasts and have never once complained watching shows in HD on ABC or FOX. I'm talking about film source shows as well such as Grey's Anatomy and other drama shows.

Here is an image I made exactly as you said I should test it. I took a Canon D20 still image and cropped it at 1920x1080 pixels. I then downscaled a 720p version and scaled it back up. I faked a 1080i version by adding slight softness to represent the interlace flicker reduction and low-pass filtering. Sure, maybe true 1080p when watched as true 1080p is sharper, but it is not "clearly" better, and only tech heads watching it on a computer monitor would ever notice.

This image does not give any numbers at all because I do not use numbers. I use my eyes, and my eyes tell me that at the end of the day there is very little difference between 1080i and 720p. If you have a better example please show me, because no matter what type of images I start with, be it 3D rendered, digital still or actual 1080i stills from HDV, I always find the same to be true. It isn't exactly fair to compare 720p from DVCPROHD tape either, since it only uses 960x720 pixels. We are talking true 1280x720 pixels.
Excuse me if I'm missing something here, but aren't you starting from an oversampled image? Of course it looks better then, just as HD downconverted to SD looks much better than SD shot as SD.

Second, a still photo with nothing moving is a bad example; you should have some motion blur and similar frame rates. 60fps looks very unnatural compared to 24fps. When I saw Showscan many years ago it was overly sharp with no motion blur: the front rung on the wooden rollercoaster was sharp, which it is not in real life to our eyes or at 24fps.

Also, isn't a still image a bit of a joke?! Interlaced looks bad as a still frame compared to a progressive frame but looks fine, or better, when seen in motion. Same with natural blur: it looks fine in motion and soft in a still. So I'd say that to compare video properly you need to view each format as moving video in its native form, and more importantly watch the moving footage side by side.

Last edited by Greg Voevodsky; November 11th, 2007 at 10:37 PM. Reason: I want to... add something..
Greg Voevodsky is offline
Old November 11th, 2007, 10:41 PM   #48
Major Player
 
Join Date: Nov 2006
Location: Baltimore, Maryland
Posts: 234
Quote:
Originally Posted by Werner Wesp View Post
Sometimes I wonder...

How many people actually know that 720p50 is far superior to 1080i50? Or how many even know that 1080i is NOT full HD? And yet - it is just a matter of the easiest mathematics:

50 x 1280 x 720 pps or 50 x 1440 x 540 pps is kinda obvious. (and that's not even taking into account the interlaced filter you need to keep the image from being jittery)
1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?

Quote:
Originally Posted by Werner Wesp View Post
Not to mention all HD screens nowadays are progressive, so you can not watch 1080i optimally.
New ones perhaps but most of the HDTVs out there are the older 1080i variety. And what format are broadcasters delivering content in? 1080i! So footage shot 720p gets converted to 1080i and DOES NOT look as good. I've done conversion from 1080i to 720p (60i and 60p) and most people do not notice a quality hit (it is minimal) but try the opposite (720p60 to 1080i60) and a lot of people notice a softening of the image.

I will give you that slo-mo looks better at 720p. So if slow motion is needed, shoot 720p for those parts but for all the rest at 1080! That's what I do...

Also, 1080p televisions deinterlacing 1080i footage looks fantastic! If a TV can do the math on the fly, imagine what software algorithms can accomplish! I'd say shooting 1080i now is "future proofing" for whenever 1080p60 becomes a broadcast format.

I also agree that 1080i is less bit-efficient than 720p. But the difference is negligible.

As far as the camera itself goes, 1/2" chips tend to give better color reproduction, latitude and of course nicer depth of field qualities. All of this translates to a more professional image compared to the 1/3" chip cameras. I think all future prosumer cameras will HAVE to be 1/2" chip cameras in order to combat the low light problem. It really is a physics problem at this point and Sony is pushing the competition to this arena...which we will all benefit from!
Adam Reuter is offline
Old November 12th, 2007, 11:44 AM   #49
Trustee
 
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
Quote:
Originally Posted by Greg Voevodsky View Post
Excuse me if I'm missing something here, but aren't you starting from an oversampled image? Of course it looks better then, just as HD downconverted to SD looks much better than SD shot as SD.

Second, a still photo with nothing moving is a bad example; you should have some motion blur and similar frame rates. 60fps looks very unnatural compared to 24fps. When I saw Showscan many years ago it was overly sharp with no motion blur: the front rung on the wooden rollercoaster was sharp, which it is not in real life to our eyes or at 24fps.

Also, isn't a still image a bit of a joke?! Interlaced looks bad as a still frame compared to a progressive frame but looks fine, or better, when seen in motion. Same with natural blur: it looks fine in motion and soft in a still. So I'd say that to compare video properly you need to view each format as moving video in its native form, and more importantly watch the moving footage side by side.
It is only a bad example if you are talking about motion. This test isn't about motion but detail, and at the end of the day there is very little change in detail between 1080i and 720p. If you want to talk motion then 60p is also better. Sure, as a still it is hard to notice the motion of interlaced, but when you watch 1080i on a digital display it is playing one field then another and so on. 60p will be much cleaner since there will not be any interlacing artifacts or flickering edges caused by fine details alternating back and forth between fields. Sure, you can add more of a low-pass filter to blur this away, but then you really are left with something just as soft as 720p.

I can make a video test as well but the detail is still going to look the same. All it is going to show is the difference between 540-line fields running 60 times a second and 720-line frames running 60 times a second. They both for the most part look the same and have exactly the same type of motion and motion blur. One just has alternating fields that can flicker.

Again this is not a 720p vs 1080i debate but a demonstration of how the 720p mode from a JVC camera or the EX1 can give very good results. In fact I would even go as far as to say that 1080i and 720p pretty much give you the same thing in the end. One is cleaner while the other may have a few slivers of detail that sneak through every now and then. There is no "clearly" better format though.

Even when you can have 100% true 1080p there just isn't that much of a huge difference. Sure it is a little more crisp, but is it really worth that much extra bandwidth? 1080p24 vs 720p24 uses a lot of extra bandwidth for that tiny edge in sharpness. People get so hung up on detail and sharpness when that is only a tiny part of image quality.
Thomas Smet is offline
Old November 12th, 2007, 12:01 PM   #50
Trustee
 
Join Date: Sep 2005
Location: Gilbert, AZ
Posts: 1,896
Quote:
Originally Posted by Thomas Smet View Post
Even when you can have 100% true 1080p there just isn't that much of a huge difference. Sure it is a little more crisp but is it really worth that much extra bandwidth?
Well, we'll know soon enough. Fortunately, the EX1's lens should have the resolving ability to answer that question. At least for myself, since the video cameras I've owned did not resolve much more than 700 lines, which is still decent.

I have a feeling 1080 24P and 30P is going to be a thing of beauty coming off these cameras. I'm real interested in the 30P for upcoming event work.
I'll probably throw in some occasional 720 60P uprezzed to 1080 30P for over crank slow-mo.
Steven Thomas is offline
Old November 12th, 2007, 12:04 PM   #51
Trustee
 
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
Quote:
Originally Posted by Adam Reuter View Post
1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?



New ones perhaps but most of the HDTVs out there are the older 1080i variety. And what format are broadcasters delivering content in? 1080i! So footage shot 720p gets converted to 1080i and DOES NOT look as good. I've done conversion from 1080i to 720p (60i and 60p) and most people do not notice a quality hit (it is minimal) but try the opposite (720p60 to 1080i60) and a lot of people notice a softening of the image.

I will give you that slo-mo looks better at 720p. So if slow motion is needed, shoot 720p for those parts but for all the rest at 1080! That's what I do...

Also, 1080p televisions deinterlacing 1080i footage looks fantastic! If a TV can do the math on the fly, imagine what software algorithms can accomplish! I'd say shooting 1080i now is "future proofing" for whenever 1080p60 becomes a broadcast format.

I also agree that 1080i is less bit-efficient than 720p. But the difference is negligible.

As far as the camera itself goes, 1/2" chips tend to give better color reproduction, latitude and of course nicer depth of field qualities. All of this translates to a more professional image compared to the 1/3" chip cameras. I think all future prosumer cameras will HAVE to be 1/2" chip cameras in order to combat the low light problem. It really is a physics problem at this point and Sony is pushing the competition to this arena...which we will all benefit from!
Wrong. Not all HD is broadcast as 1080i. That is a myth. ABC, FOX and ESPN are 720p.

Of course 1080i creates an optical illusion of solid frames, but that doesn't change the fact that it is still 540-line fields alternating. This makes it look like 60p, but it can show detail flicker if those details are too small. In order to make sure you do not have details that are one pixel thick you have to use low-pass filtering, which in the end brings the sharpness down. The image I posted shows that. The interlacing itself doesn't reduce the level of detail at all; it is the low-pass filtering that reduces the detail of 1080i, and the jagged edges, that are the problem. If you do not agree with what I have done please do your own test and show me I am wrong. I have yet to see any examples of how 1080i is that much better than 720p.
Thomas Smet is offline
Old November 12th, 2007, 07:16 PM   #52
Major Player
 
Join Date: Feb 2006
Location: Philadelphia
Posts: 795
Quote:
Originally Posted by Adam Reuter View Post
1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?
Actually the passage you quoted was not equating 1080i to 540 lines of resolution - you missed the third number, which indicates the samples per second. So 50 x 1440 x 540 translates to 50 samples (fields) per second, each 1440 pixels wide by 540 lines high. A frame, composed of two fields, will have more than 540 lines of resolution, but less than 1080 due to the low-pass filtering, as Thomas mentioned. However each individual field cannot have more than 540, because that's how many horizontal lines there are.
__________________
My latest short documentary: "Four Pauls: Bring the Hat Back!"
Evan Donn is offline
Old November 12th, 2007, 08:40 PM   #53
Major Player
 
Join Date: Nov 2006
Location: Baltimore, Maryland
Posts: 234
Quote:
Originally Posted by Evan Donn View Post
Actually the passage you quoted was not equating 1080i to 540 lines of resolution - you missed the third number, which indicates the samples per second. So 50 x 1440 x 540 translates to 50 samples (fields) per second, each 1440 pixels wide by 540 lines high. A frame, composed of two fields, will have more than 540 lines of resolution, but less than 1080 due to the low-pass filtering, as Thomas mentioned. However each individual field cannot have more than 540, because that's how many horizontal lines there are.
Again, on paper 720p has higher temporal resolution, but can humans really discern 60 frames per second? I won't dispute that 720p is best for sports and high-motion work. But for those landscape/vista shots, that's when 1080 counts. And that's why Discovery HD Theatre broadcasts in that format.

When blended together we get 1080 lines. On a good enough TV (a Pioneer Elite series or something) 1080 lines definitely look better than 720. Sitting closer to the TV lets you see an even greater difference. Even on the 55" projection HDTV that I have at home, FOX and ABC look softer than NBC and CBS broadcasts. This was BEFORE I found out how they broadcast. Originally I thought Comcast was giving those channels less bandwidth, but now I know why the picture looks softer. I probably wouldn't notice the difference on a 40" or smaller HDTV though.

Like Jody Elred said, try converting 1080i to 720p and then try the reverse. 1080 yields better resolution. And since there are reports that this camera resolves 1,000 lines on the charts, users should definitely benefit from shooting in the 1080i format.

But really, perhaps all this debate is moot when we look at surveys. Contrast ratio is the number-one factor in how picture quality is perceived. The 1/2" chip offering should help that a lot. And number two is color saturation/accuracy. Hopefully Sony gets these right!
Adam Reuter is offline
Old November 12th, 2007, 11:04 PM   #54
Inner Circle
 
Join Date: Sep 2004
Location: Sacramento, CA
Posts: 2,488
Quote:
Originally Posted by Thomas Smet View Post
Even when you can have 100% true 1080p there just isn't that much of a huge difference.
Sure there is: full 1080p has 2.25 times the resolution of 720p, which in turn has 2.67 times the resolution of SD video. If you can see the difference between 720p and SD, you'll be able to see the difference between 1080p and 720p on a 1080p display.
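Kevin's ratios check out; here is the quick arithmetic (taking SD as 720x480 NTSC, an assumption on my part):

```python
# Pixel-count ratios between the common formats.
hd1080 = 1920 * 1080   # full-raster 1080p
hd720  = 1280 * 720    # 720p
sd     = 720 * 480     # NTSC SD (assumed)

print(round(hd1080 / hd720, 2))  # 2.25
print(round(hd720 / sd, 2))      # 2.67
```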

And enough of the simulated photo-based examples: either compare footage from actual cameras or wait until someone else gets a chance to do so. Thanks.
Kevin Shaw is offline
Old November 13th, 2007, 05:20 AM   #55
Major Player
 
Join Date: Jan 2007
Location: Pinellas Park
Posts: 232
Quote:
Originally Posted by Werner Wesp View Post
Sometimes I wonder...

How many people actually know that 720p50 is far superior to 1080i50? Or how many even know that 1080i is NOT full HD? And yet - it is just a matter of the easiest mathematics:
Totally wrong. I'm in NTSC land, but the concept is still the same. You cannot measure interlace as progressive because it's not progressive: it splits each frame into two fields to achieve the 1080 lines of resolution. The shortcoming of interlace lies in fast motion; that's where progressive shines. But, basically, 1080 is bigger than 720. You can't argue the numbers; however, I wouldn't classify 1080i as far superior, but it is superior.

Oh, by the way, just because a particular codec handles the pixels differently doesn't mean it's not full HD: 1440 x 1080i will display on a monitor as 1920 x 1080i.

1080i is the best we have until the broadcast bandwidth problems are solved and 1080p becomes the high-def standard.
John Bosco Jr. is offline
Old November 13th, 2007, 05:43 AM   #56
Major Player
 
Join Date: Mar 2005
Location: Northampton, England
Posts: 500
But 1080i isn't superior... Interlace is a dinosaur from the 1930s when tube technology couldn't scan full-resolution images fast enough to give the perception of a moving image. All new LCD technology is progressive and has to de-interlace interlaced signals to display them... and how well that is done really depends on the set.

Mathematically 1920 x 1080i has the edge over 720p, but 720p has the edge over 1440 x 1080i.

1920 x 540 x 60i = 62.2 million pixels per second
1280 x 720 x 60p = 55.3 million pixels per second
1440 x 540 x 60i = 46.7 million pixels per second
720 x 240 x 60i = 10.4 million pixels per second
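The figures above are easy to verify (Python used as a calculator; each line is width x lines-per-sample x samples-per-second):

```python
# Recompute the per-second pixel throughput for each scanning format.
def mpix_per_sec(width, lines_per_sample, samples_per_sec):
    return width * lines_per_sample * samples_per_sec / 1e6

print(round(mpix_per_sec(1920, 540, 60), 1))  # 1920x1080i60 -> 62.2
print(round(mpix_per_sec(1280, 720, 60), 1))  # 1280x720p60  -> 55.3
print(round(mpix_per_sec(1440, 540, 60), 1))  # 1440x1080i60 -> 46.7
print(round(mpix_per_sec(720, 240, 60), 1))   # 480i60 SD    -> 10.4
```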

However, what does it look like? Personally I think all progressive images look preferable to similar spec interlaced images. Our eyes do not see interlace, and surely the ultimate goal is to match the image to our eyes? Although I find 720p60 a touch unnerving because the motion seems a little too fluid... It takes away the dreamy 25p haze!

It's interesting how it's partly cultural. A higher proportion of North Americans seem attached to interlace than Europeans. I suspect it has to do with the fact that we Europeans have been seeing 25p (or psf) every time we watch a film on TV for the past 60 years, whereas there hasn't been any historical broadcasting of 30p (or real 24p) in North America, given that films for 60Hz regions get that awful cack-handed 2:3 pulldown. (See, it's cultural.)
__________________
Alex
Alex Leith is offline
Old November 13th, 2007, 05:47 AM   #57
Major Player
 
Join Date: Jan 2007
Location: Pinellas Park
Posts: 232
What some of you don't understand is that the networks' decision on which high-definition standard to go with was not too different from some of our arguments. For example, the U.S. networks CBS and NBC overwhelmingly decided to go with 1080i simply because the top executives could not accept 720p as high definition. ABC and Fox felt that progressive would be better for the majority of their shows, which are sports. Although they were hoping for 1080p, they felt that the faster motion of sports would look better using 720p.
John Bosco Jr. is offline
Old November 13th, 2007, 06:06 AM   #58
Inner Circle
 
Join Date: Jan 2006
Posts: 2,699
Quote:
Originally Posted by John Bosco Jr. View Post
....... the networks' decision on which High Definition standard to go with was not too much different than some of our arguments.
Timing is also important, with HD broadcasting becoming a reality far earlier in the States than most of the rest of the world (except Japan).

Screens with 1920 horizontal resolution have only become available at sensible prices in the last year or so, and the same goes for cameras: for 60Hz standards, HDCAM is 1440 horizontal, DVCProHD (1080) is 1280. Only now is 1080 starting to mean 1920, if you like. So the networks' arguments may well have gone along the lines of "why go with a format whose undeniable advantage (horizontal resolution) can't really be taken full advantage of in practice?"

Fair point - but things have moved on. HD blue laser discs are full 1080p, and many screens sold now are 1920x1080 - and likely to approach 100% before very long. Differences that once didn't matter now increasingly do, and the danger for ABC and Fox is that more and more people WILL notice differences as screen and camera technology progresses.

But hopefully the current 1080p/25 and 1080i/25 mix will eventually go to 1080p/50, and then everybody will be happy.
David Heath is offline
Old November 13th, 2007, 07:31 AM   #59
Major Player
 
Join Date: Feb 2006
Location: Houston
Posts: 789
Quote: From ABC executive on decision to go with 720p...

"They usually fail to mention that during the time that 1080i has constructed a single frame of two million pixels, [1/25] second, 720p has constructed two complete frames, which is also about two million pixels. In a given one-second interval, both 1080i and 720p scan out about 60 million pixels. The truth is that, by design, the data rates of the two scanning formats are approximately equal, and 1080i has no genuine advantage in the pixel rate department. In fact, if the horizontal pixel count of 1080i is reduced to 1440, as is done in some encoders to reduce the volume of artifacts generated when compressing 1080i, the 1080i pixel count per second is less than that of 720p".

This 720p vs. 1080i argument has been going on a while now. The truth is that in the late 90's different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp.

The amazing thing is now the EX does both. Life is good.
__________________
David Parks: DP/Editor: Jacobs Aerospace at NASA Johnson Space Center
https://www.youtube.com/user/JacobsESCG
David Parks is offline
Old November 13th, 2007, 08:09 AM   #60
Inner Circle
 
Join Date: Dec 2004
Location: Arlington, TX
Posts: 2,231
Quote:
Originally Posted by David Parks View Post
This 720p vs. 1080i argument has been going on a while now. The truth is that in the late 90's different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp.

Bingo.

When they signed those contracts nobody had HD televisions in their houses and the equipment to produce in HD was such an investment that the only way to convince networks to use it was to have the manufacturers foot most of the bill.

Hence Sony or Panasonic got to choose the format because they were paying for it.

I really don't think pointy-headed pixel peeping (that we like to talk about) had much bearing on the choice for the execs. IMHO
Tim Polster is offline
1998-2024 The Digital Video Information Network