DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   JVC GY-HD Series Camera Systems (https://www.dvinfo.net/forum/jvc-gy-hd-series-camera-systems/)
-   -   Can you shoot uncompressed? (https://www.dvinfo.net/forum/jvc-gy-hd-series-camera-systems/102644-can-you-shoot-uncompressed.html)

Jim Andrada September 3rd, 2007 04:42 PM

Bill,

Here's the link to the post by Doug Harvey showing the result of analog capture by Blackmagic studio card.

http://www.dvinfo.net/conf/showthrea...988#post736988

Not very exciting! I think I'd go with Firewire.

Steven Thomas September 3rd, 2007 06:59 PM

I hear you.

I fired a message off to Blackmagic regarding the analog component issue brought up in that thread and received no response. I asked if their Intensity card offered a higher-quality A/D conversion. Of course, it seems hard to believe it would, since the Studio card costs 3X the Intensity Pro!

I might experiment with my JVC HD100: buy a component-to-HDMI converter and then the cheaper HDMI-only Intensity card. I plan on using Cineform NEO HDV to capture.
http://www.cineform.com/products/Tec.../Intensity.htm

I'm not sure why the Blackmagic Studio card's component input is poor, as mentioned in that thread. I don't believe the problem is the component D/A output of the JVC HD100, since others have used this connection and captured via AJA component>SDI.

Claude Mangold September 11th, 2007 09:02 AM

Hi all, I just got the info from JVC Switzerland's sales manager: the 251 can output any 720p signal (24p, 25p, etc.) uncompressed via HD-SDI at 100 Mbit/s.
For digital cinematography this is absolutely super.
BTW, he also told me about a recent direct comparison between a Viper's HD-SDI output and the 251's - and claims there is almost no difference, even though the Viper is 4:4:4. Hard to believe!

Terry VerHaar September 11th, 2007 02:34 PM

Quote:

Originally Posted by Jim Andrada (Post 738563)
For example, as I understand it, the chroma information is not really sampled for every pixel in the first place, so I wonder where the color space "compression" to 4:2:0 happens. It would seem most logical to do this at the front end of the process.

We do know that anything recorded to tape is already compressed, so it would have to be decompressed before being output from the camera.

Not sure why it would seem logical, but that isn't what happens. The signal is captured and converted (A/D) before it is compressed to the HDV standard. The need for compression is really all about being able to record to mini-DV tape, which became the standard when DV was invented. The 4:2:0 chroma sampling is a direct result of the way the signal is down-sampled during compression. Be clear that it is a result of the HDV compression, not of the original sampling when the signal is read from the chip and digitized.
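[Editor's note] The down-sampling Terry describes can be made concrete with simple sample-count arithmetic. This is an illustrative sketch only (the function name and the frame size are the editor's choices, and real HDV additionally applies MPEG-2 compression on top of the subsampling):

```python
# Sketch: luma + chroma sample counts per 1280x720 frame under
# different chroma subsampling schemes. Illustrative only -- actual
# HDV also applies DCT-based MPEG-2 compression after subsampling.

def samples_per_frame(width, height, scheme):
    """Total luma + chroma samples for one frame."""
    luma = width * height
    if scheme == "4:4:4":      # chroma sampled at full resolution
        chroma = 2 * width * height
    elif scheme == "4:2:2":    # chroma halved horizontally
        chroma = 2 * (width // 2) * height
    elif scheme == "4:2:0":    # chroma halved both horizontally and vertically
        chroma = 2 * (width // 2) * (height // 2)
    else:
        raise ValueError(scheme)
    return luma + chroma

full = samples_per_frame(1280, 720, "4:2:2")   # what the uncompressed tap carries
hdv  = samples_per_frame(1280, 720, "4:2:0")   # after HDV's subsampling
print(full, hdv)  # 1843200 vs 1382400: 4:2:0 keeps half the chroma of 4:2:2
```

So per frame, 4:2:0 throws away half of the chroma samples that 4:2:2 retains, while the luma is untouched.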

Quote:

Originally Posted by Jim Andrada (Post 738563)
This means that there would have to be two separate paths through the camera and it would be different for live capture and for playback

And indeed there are. Three, in fact, for the HD-250! For the 110/200 there is the analog signal that gets sent out via component (or composite), and there is the converted/digitized signal that gets recorded to tape and sent out over Firewire.

In the 250 there is also a route whereby the digitized signal gets sent to the HD-SDI port BEFORE it is compressed to the HDV standard. Because it is the result of the A/D conversion but NOT the compression, it carries 4:2:2 chroma sampling rather than the 4:2:0 inherent in the HDV compression.
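[Editor's note] A rough back-of-envelope calculation shows why that pre-compression tap needs HD-SDI rather than Firewire. The figures assume 10-bit 4:2:2 active picture (what SMPTE 292 HD-SDI carries) and the HDV spec's ~19.7 Mbit/s video rate for 720p; blanking, audio and ancillary data are ignored, so the real 1.485 Gbit/s link rate is higher still:

```python
# Back-of-envelope: active-picture payload of uncompressed 720p
# 4:2:2 10-bit versus the fixed HDV 720p video bit rate. Ignores
# blanking and ancillary data, so the actual SMPTE 292 link rate
# (1.485 Gbit/s) is higher than this payload figure.

def uncompressed_mbps(width, height, fps, bits=10):
    samples = width * height + 2 * (width // 2) * height  # Y + Cb + Cr at 4:2:2
    return samples * bits * fps / 1e6

HDV_720P_MBPS = 19.7  # HDV spec video rate for 720p

rate = uncompressed_mbps(1280, 720, 30)
print(round(rate), "Mbit/s, roughly", round(rate / HDV_720P_MBPS), "x the HDV rate")
```

Even at 30 fps the uncompressed payload is over half a gigabit per second, more than an order of magnitude above what HDV records to tape.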

Quote:

Originally Posted by Jim Andrada (Post 738563)
Somehow I wonder if the in camera process might not be to take the images off the sensors and compress them directly to a memory buffer. Then a Tape Write process could look at the contents of the buffer and re-arrange it as needed (inserting error correcting codes etc) and write it to tape. An Analog Out process could be looking at the same buffer and just doing whatever was necessary to put it on the analog out lines, and other output processes (including SDI) could work the same way. It would be very neat because there wouldn't be any inter process communication or alternate data flow or pathing logic, just a bunch of independent converters sucking off the same buffer and outputting the converted data.

I can't say I understand completely what you are suggesting, but it seems like exactly what happens now - sans the buffers, perhaps. The analog signal is taken from the chip and sent directly to the YUV out. In the 250, the digital signal is tapped post-A/D and prior to compression and sent to HD-SDI. Also, in all three cameras, the signal is taken post-A/D and compressed to HDV, then sent to tape and out over Firewire.
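[Editor's note] The shared-buffer architecture Jim is describing can be sketched in a few lines. All names and structure here are hypothetical illustrations of the idea, not JVC's actual firmware: independent output "taps" all read the same frame buffer and neither know nor care whether a frame arrived from live capture or tape playback.

```python
# Sketch of a shared-buffer fan-out: any producer (live capture or
# tape playback) publishes a frame, and each attached output tap
# converts/forwards it independently. Hypothetical, not JVC firmware.

class FrameBuffer:
    def __init__(self):
        self.frame = None
        self.taps = []

    def attach(self, tap):
        self.taps.append(tap)

    def publish(self, frame):
        """Called by ANY producer: the live capture path or tape playback."""
        self.frame = frame
        for tap in self.taps:          # each output process works independently
            tap(frame)

outputs = []
buf = FrameBuffer()
buf.attach(lambda f: outputs.append(("analog-YUV", f)))
buf.attach(lambda f: outputs.append(("firewire-HDV", f)))
buf.attach(lambda f: outputs.append(("hd-sdi-422", f)))

buf.publish("live-frame-001")   # live capture drops a frame in
buf.publish("tape-frame-042")   # playback drops into the same buffer
print(outputs)
```

The design point is exactly Jim's: no inter-process communication or path-selection logic, just independent converters reading one buffer.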

Quote:

Originally Posted by Jim Andrada (Post 738563)

It would also be simple for the playback process because the tape input process could just drop the data from tape into the same buffers, and the back-end processes would just continue to look at the buffer and do their things. In other words, in this model, none of the output processes would care how the data got to the buffer, or whether it was live or playback; they'd just act like the display adapters in a PC and shove whatever showed up in the buffer out their respective ports in the appropriate formats. It would also mean that the whole set of front-end processes like sampling, compression, and tape read could be completely common for each camera in the line-up, whether it had SDI or not. And in fact, it would be easy to offer a variety of output processes because they could also be common from camera to camera.

One thing to note is that the uncompressed signals are only available through the YUV and HD-SDI ports AS they are being captured (i.e. LIVE) - NOT from tape after they are recorded. Once the signal is compressed to mini-DV tape in the camera, it is forever HDV at 4:2:0.

Quote:

Originally Posted by Jim Andrada (Post 738563)
I know I worry about this too much! It just bothers me to think that there are two different ways for an image to get to the output bus depending on whether it's live or from tape. Possible of course!

Don't worry. Be happy! :-)

Jim Andrada September 11th, 2007 04:26 PM

Terry,

I hope it works the way you posit - it would sure be nice to have an analog stream to convert to digital.

But I'm still bothered by a nagging little voice that mutters things like "read out" from the CCD.

In my theory of operation (speculation of operation if you prefer) "reading out" from a CCD is already something one does in the digital domain just like one "reads out" from a memory device.

I think what I'm getting at is that I don't believe there is any inherent "output" from the CCD in the real-time analog domain. I think the CCD just sits there accumulating charge on each of its tiny little pixels until something happens - that something being a "read out" under control of some analog but digitally controlled process which has to interpret the value of each pixel from the charge.

The key here is that understanding the layout - what pixel is where, and what a given charge level (or more accurately, a range of charge levels) means - and mapping the pixels in some fashion doesn't seem like an inherently analog process. Of course the actual electrical implementation of "read out" is analog, but I think without some higher level of control (typically in the digital domain) the little CCDs would happily sit there until they saturated 100% and would never output an analog signal.
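[Editor's note] Jim's mental model of a CCD - wells that simply integrate charge until an external, digitally clocked readout interprets them, and saturate if never read - can be sketched as a toy simulation. All numbers and names here are illustrative, not real sensor physics:

```python
# Toy model: a CCD well integrates charge until a clocked readout
# converts it to a digital code; left unread, it saturates at the
# full-well level. Illustrative numbers only, not sensor physics.

FULL_WELL = 1000  # arbitrary saturation level (electrons, say)

def expose(pixels, photons_per_step, steps):
    """Accumulate charge; each well clips at FULL_WELL."""
    return [min(p + photons_per_step * steps, FULL_WELL) for p in pixels]

def read_out(pixels, gain=0.255):
    """Digitally clocked readout: interpret the charge as an 8-bit code."""
    return [min(int(p * gain), 255) for p in pixels]

wells = [0, 0, 0]
wells = expose(wells, photons_per_step=30, steps=20)    # a normal exposure
print(read_out(wells))

wells = expose(wells, photons_per_step=30, steps=1000)  # never read out: saturates
print(read_out(wells))
```

The second readout returns nothing but clipped maximum codes, which is Jim's point: the wells only yield useful values because something external decides when and how to read them.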

I certainly wish I knew an engineer at JVC who could set me straight (right or wrong makes little difference, as I think the value of all this is in getting us to think more about what goes on inside these little video computers - oops, I meant cameras! Or did I?)



DV Info Net -- Real Names, Real People, Real Info!
1998-2025 The Digital Video Information Network