DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   JVC GY-HD Series Camera Systems (https://www.dvinfo.net/forum/jvc-gy-hd-series-camera-systems/)
-   -   WooHooo!! HD250 is here! (https://www.dvinfo.net/forum/jvc-gy-hd-series-camera-systems/91763-woohooo-hd250-here.html)

Jeffrey Butler April 17th, 2007 08:33 PM

WooHooo!! HD250 is here!
 
Well, it's here. I got the HD250 and the 17x5 lens...I barely have the batteries charged, and shot a few minutes on a gray day of some deer up on the Skyline Drive here, but it was really, really windy and cold. So I won't be posting any of that!

So there's an HDV 720/30p codec setting for FCP. Is that better than, say, shooting 720/60p and capturing with the Apple Intermediate Codec? I captured this footage (60p) w/ the AIC, and before each shot I got an 11-second clip for each clip (useless, btw).

If I shoot 720/60 and capture via FCP, I see the footage is 59.94 and the codec is AIC - that's good, right? Is it cleaner w/ the "built-in" 720/30p codec? I'm just curious. I'll try all this out tomorrow w/ some sunshine...

Justin Ferar April 17th, 2007 10:52 PM

In my personal tests I can see a little bit of extra noise using AIC for 720p60, but the main thing is that timecode is not supported - pretty much a deal breaker. Blackmagic announced a neat little thing called the Intensity Pro, which we will be using to digitize 720p60 from the deck's HDMI. Unbelievably, this thing has component I/O as well - $350. You can use FireWire for TC and deck control.

To answer your question, 720p30 (the HDV preset) would be slightly cleaner than 720p60 as AIC. What I think is more important, though, is which look you are going for, as they are two completely different animals.

Let us know what you conclude! And what are you using for monitoring?

Jeffrey Butler April 18th, 2007 05:32 PM

Quote:

Originally Posted by Justin Ferar (Post 662368)
To answer your question, 720p30 (the HDV preset) would be slightly cleaner than 720p60 as AIC. What I think is more important, though, is which look you are going for, as they are two completely different animals...And what are you using for monitoring?

Right now I have two options. I have an older JVC 16:9-capable HI-900SU CRT. I also have two 23" Cinema Displays. Since it's all coming off the Mac via FCP over IEEE 1394, it's pretty much a Cinema Display decision...

So this 1280x720 is really small! I'm feeling disappointed for some reason (haha). My Sony HC-3 consumer palmcorder shoots 1440x1080i (yeah, one chip) but it stretches to fill my cinema. At 1:1, the HDV from this HD250 is "tiny" and while it does stretch to fill when told to, I don't mind saying that 720 really does "feel smaller" than 1080. Gee. Maybe that's because it is! But there must be a good reason this camera shoots 720...or why 720 is a standard.

And yeah, wow, 30p feels way different than 30i. I like it alright - it's nice to have true progressive frames, though the look is reminiscent of Canon's "frame mode."

Honestly, I don't have any idea what frame rate to use for this upcoming "sales assistance" DVD I'm shooting tomorrow. It's all indoor lighting, and guys talking about countertops and cabinets. High-end stuff, so maybe the 24p will give it a better look. I was gung-ho for 60p, but that's got enough issues in itself to push me into 24 or 30...plus, after shooting it and converting it, it seems, at the moment, counterproductive. I did just shoot some 60p of my kids' soccer game (again, overcast), so I'll see what I can do with that...

Justin Ferar April 18th, 2007 06:53 PM

Hey Jeff,

If it's a corporate gig you might think hard about shooting it 480p60 (16x9) just to play it safe - which still looks VERY nice on LCDs and plasmas. Just a thought.

I just hear about a lot of folks who shoot 720p30 and don't realize they have to shoot differently to deal with the temporal motion.

Anyway, I've been looking at the HD200's footage on the new JVC HD broadcast monitor, and while it's not 1080, it looks gorgeous. I'm trying to track down some 1080i footage to compare. My current opinion is that resolution is not everything (though certainly an important factor), which is why we went with JVC (progressive chips) over Canon, Sony, and Panasonic.

Okay- render is done so now I have to get back to editing.

Cheers mate.

Stephan Ahonen April 18th, 2007 07:36 PM

Quote:

Originally Posted by Jeffrey Butler (Post 662796)
So this 1280x720 is really small! I'm feeling disappointed for some reason (haha). My Sony HC-3 consumer palmcorder shoots 1440x1080i (yeah, one chip) but it stretches to fill my cinema. At 1:1, the HDV from this HD250 is "tiny" and while it does stretch to fill when told to, I don't mind saying that 720 really does "feel smaller" than 1080. Gee. Maybe that's because it is! But there must be a good reason this camera shoots 720...or why 720 is a standard.

I suggest looking at the images side by side and comparing. I'll bet the JVC will win. It's about more than raw numbers.

The reason we have the 720p standard is bandwidth. 720p60 is about the most picture information you can reasonably fit in a broadcast channel without interlacing. JVC was targeting this camera toward broadcasters who would want to transmit video through a microwave link back to the studio, so the bandwidth requirements of the camera are geared toward the ability to fit the picture into a 6 MHz broadcast channel (about 20 Mbit/s).
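To put rough numbers on that, here's a minimal Python sketch of the back-of-the-envelope math. The ~19.4 Mbit/s ATSC payload figure and the 8-bit 4:2:0 sampling are my assumptions for illustration; the post above only quotes "about 20 Mbit/s."

```python
# Rough sketch: how much compression is needed to fit HD into a 6 MHz channel.
# Assumptions (mine): 8-bit 4:2:0 sampling (~12 bits/pixel) and ~19.4 Mbit/s
# of usable ATSC payload per channel.

def raw_bitrate_mbps(width, height, fps, bits_per_pixel=12):
    """Uncompressed video bitrate in Mbit/s."""
    return width * height * fps * bits_per_pixel / 1e6

channel_mbps = 19.4  # approximate payload of one 6 MHz broadcast channel

for name, (w, h, fps) in {"720p60": (1280, 720, 60),
                          "1080p60": (1920, 1080, 60)}.items():
    raw = raw_bitrate_mbps(w, h, fps)
    # Compression ratio needed to squeeze the raw stream into the channel.
    print(f"{name}: raw ~{raw:.0f} Mbit/s -> ~{raw / channel_mbps:.0f}:1 compression needed")
```

Under those assumptions 720p60 needs roughly 34:1 compression to fit, while 1080p60 needs roughly 77:1, which is the sense in which 720p60 is about the practical ceiling without interlacing.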

Interlacing is basically a hack dating from the 1920s. The technology of the time wasn't capable of the bandwidth requirements of progressive scan except at relatively slow scan speeds that produced annoying flickering to the human eye. So they used interlacing to scan the screen twice as fast. Obviously, we're not limited by 1920s technology anymore, which is why it strikes me as extremely stupid that, given the chance to eliminate all the headaches that come with interlacing, engineers chose to implement it anyway in one of the HD standards.

Werner Wesp April 20th, 2007 04:55 AM

Quote:

Originally Posted by Stephan Ahonen (Post 662864)
Interlacing is basically a hack dating from the 1920s. The technology of the time wasn't capable of the bandwidth requirements of progressive scan except at relatively slow scan speeds that produced annoying flickering to the human eye. So they used interlacing to scan the screen twice as fast. Obviously, we're not limited by 1920s technology anymore, which is why it strikes me as extremely stupid that, given the chance to eliminate all the headaches that come with interlacing, engineers chose to implement it anyway in one of the HD standards.

Thanks Stephan, I couldn't say it any better. I understand why the interlaced standard still stands, though: consumers are used to shooting that way, and given a 24p or 25p camera they'd have some serious problems (50p and 60p, on the other hand, won't be a problem). I almost see 1080i as a consumer transition format: 1080 sounds better than 720, and we can shoot the way we always did...

But there is more: which has the most information?
1080i (25 fps) = 1440 x 540 (lines per field) x 50 (fields per second) = 38,880,000 pixels of info per second - actually it is even 30% less than this to avoid flickering of single-line details, i.e. about 27,000,000
720p (50p) = 1280 x 720 x 50 = 46,080,000
720p (25p) = 1280 x 720 x 25 = 23,040,000

Long boring calculations to state the obvious... 720p is as sharp as 1080i, and 720p50 has even a lot more info. Ever thought of the fact that virtually no one can see HD on anything other than a progressive display? Everyone watching the 1080i footage is actually looking at 1440x540 pixels - that's vertically even less than PAL progressive... Not to mention the actual quality of camera heads - or the efficiency of compressing progressive frames versus interlaced fields (in interframe compression methods)...
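If it helps to see the same arithmetic laid out, here's a small Python sketch of the comparison above (the format labels are mine):

```python
# Werner's pixel-rate comparison, spelled out.

def pixels_per_second(width, lines_per_pass, passes_per_second):
    return width * lines_per_pass * passes_per_second

rates = {
    "1080i50 (1440 wide)": pixels_per_second(1440, 540, 50),  # 38,880,000
    "720p50":              pixels_per_second(1280, 720, 50),  # 46,080,000
    "720p25":              pixels_per_second(1280, 720, 25),  # 23,040,000
}
for name, rate in rates.items():
    print(f"{name}: {rate:,} pixels/s")

# Knock roughly 30% off 1080i for the anti-flicker filtering mentioned above:
print(f"1080i50 effective: {rates['1080i50 (1440 wide)'] * 0.7:,.0f} pixels/s")  # ~27 million
```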

Clinging to interlaced methods is a bit like clinging to old standards: there's no technology in a 1920s car that's better than the technology in a 2007 car and should be kept because it was better - one might just argue that it has a certain style that's gone now (perhaps)...

Stephan Ahonen April 20th, 2007 02:01 PM

Quote:

Originally Posted by Werner Wesp (Post 663737)
But there is more: which has the most information?
1080i (25 fps) = 1440 x 540 (lines per field) x 50 (fields per second) = 38,880,000 pixels of info per second

Broadcast 1080i is 1920 pixels wide, so a full 1080i stream actually takes just slightly more bandwidth than a 720p stream. You're spot on with your other observations, though. A lot of formats (HDV, DVCPRO HD, HDCAM) try to reduce bandwidth requirements by only recording 1440 pixels across (1280 for DVCPRO HD at 60i), but over-the-air broadcast, gaming consoles, and consumer HD formats like Blu-ray and HD DVD all operate at a resolution of 1920x1080. The engineers may have given us interlacing, but at least they kept the pixels square. =D
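For the 60 Hz case, the same kind of arithmetic looks like this (a quick sketch of my own, not from the post; exact broadcast rates use 59.94 Hz, which scales both numbers identically):

```python
# Full-raster (1920-wide) 1080i versus 720p60, in pixels per second.
i1080 = 1920 * 540 * 60   # 62,208,000 (540 lines per field, 60 fields/s)
p720  = 1280 * 720 * 60   # 55,296,000 (full frames)

print(f"1080i60: {i1080:,} pixels/s")
print(f"720p60:  {p720:,} pixels/s")
print(f"ratio:   {i1080 / p720:.3f}")  # ~1.125, i.e. "just slightly more"
```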

Quote:

I almost see 1080i like a consumer-transition-format: 1080 sounds better then 720 and we can shoot the way we always did...
Meh, consumers could do just as well with a 720p camcorder as they could with a 1080i camcorder. A 1080 consumer camcorder could run at 30/25 fps anyway if the bandwidth for 60/50p is too much; it's not like most consumers care about silky-smooth motion. They'd buy it for the "1080" stamped on the side, thinking "it's a bigger number, so it's better." Those who know better would look for the camcorders that shoot 60p.

I would rather have seen 1080 be fully progressive as a possible "step up" in the future when more bandwidth is available over the air, with 720p being the de facto over-the-air standard. The bandwidth is already there on cable, which, since it isn't regulated by the FCC, can fit more bandwidth per channel, and like I said before, things like game consoles and home theater systems can do 1080 no problem.



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network