DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Canon VIXIA Series AVCHD and HDV Camcorders (https://www.dvinfo.net/forum/canon-vixia-series-avchd-hdv-camcorders/)
-   -   HV20 HDMI Out = 8bit.. chance of 10bit? (https://www.dvinfo.net/forum/canon-vixia-series-avchd-hdv-camcorders/94536-hv20-hdmi-out-8bit-chance-10bit.html)

Peter Moretti January 22nd, 2008 10:20 PM

Has anyone definitively answered Robert's original question: can the HV-20 output 10-bit color, or is it only 8-bit?

Yi Fong Yu January 23rd, 2008 09:21 AM

Isn't the CMOS 8-bit? If so, even if the rest of the 'chain' is 10-bit, it wouldn't matter, since the original sensor is limited to 8 bits anyway. To get true higher color depth you must have it from the very beginning to the very end (including editing/playback systems), and AFAIK, 99% of that chain for everyone is all 8-bit.
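
To put numbers on that, here's a quick Python sketch (purely illustrative, nothing camera-specific) of how many distinct code values per channel each bit depth actually gives you:

Code:

import numpy as np

# Quantize the same ideal, smooth luma ramp at 8-bit and at 10-bit depth.
ramp = np.linspace(0.0, 1.0, 4096)

levels_8  = np.round(ramp * 255) / 255      # 2^8  = 256 code values
levels_10 = np.round(ramp * 1023) / 1023    # 2^10 = 1024 code values

print(np.unique(levels_8).size)    # 256  -> coarser steps, prone to banding
print(np.unique(levels_10).size)   # 1024 -> 4x finer steps per channel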

Mikko Lopponen January 23rd, 2008 02:31 PM

Is there even a 10-bit monitor? :) And what video card outputs 10-bit anyway?

Peter Moretti January 23rd, 2008 03:37 PM

Yi, I've read differing accounts of the sensor, all seeming credible. Some say 8-bit and some say 12-bit.

Mikko, you are kidding, right?

Ian G. Thompson January 23rd, 2008 09:23 PM

Quote:

Originally Posted by Robert Ducon (Post 720446)
But by the same token, I could set my computer to capture to an uncompressed 10-bit codec of my choice - say, Blackmagic Uncompressed 10-bit, or ProRes 422 HQ 10-bit.

That's not my point though...

Does the HV20 actually OUTPUT 10-bit 1080i LIVE footage via HDMI? That is my only question, and has been from the start.

I don't think anyone knows the answer for certain. But David Newman of CineForm seems to think it's 8-bit and upsampled from 1440x1080 to 1920x1080 out the HDMI output "live." That last part I doubt... but I'm leaning toward this cam actually outputting 8 bits. My 2 cents.

Mikko Lopponen January 24th, 2008 03:37 AM

Quote:

Originally Posted by Peter Moretti (Post 813021)
Mikko, you are kidding, right?

Somewhat. Yes, I know it helps in color correcting, blah blah blah, but a full 10-bit pipeline would have to include native 10-bit monitors. An LCD panel can do its color correction at 10 bits internally, but the final output is still 8-bit.
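
A rough Python sketch of that point (the gamma tweak and all the numbers here are made up, just to illustrate): doing the math at 10 bits internally and rounding to 8 bits once at the end preserves more of the original code values than rounding to 8 bits between every step:

Code:

import numpy as np

# Hypothetical two-step correction: apply a gamma tweak, then undo it.
src = np.arange(256) / 255.0                      # 8-bit input, normalized

# Pure 8-bit pipeline: round to 8 bits between the two steps.
mid8  = np.round((src ** 0.9) * 255) / 255
out8  = np.round((mid8 ** (1 / 0.9)) * 255) / 255

# 10-bit internal pipeline: same math, 10-bit intermediate, 8-bit output.
mid10 = np.round((src ** 0.9) * 1023) / 1023
out10 = np.round((mid10 ** (1 / 0.9)) * 255) / 255

# The 8-bit intermediate merges some neighboring codes for good;
# the 10-bit intermediate keeps nearly all 256 through the round trip.
print(np.unique(out8).size, np.unique(out10).size)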

Peter Moretti January 24th, 2008 03:55 AM

Quote:

Originally Posted by Mikko Lopponen (Post 813313)
Somewhat. Yes, I know it helps in color correcting, blah blah blah, but a full 10-bit pipeline would have to include native 10-bit monitors. An LCD panel can do its color correction at 10 bits internally, but the final output is still 8-bit.

I'm just not sure that I'm following you. AFAIK, there are many monitors that display 10-bit and higher color.

Here is an example from Eizo (which, while a very nice monitor, is not a true studio or production monitor): http://www.eizo.com/products/graphics/cg241w/spec.asp

Unless I'm mistaken, this monitor displays either 12- or 16-bit color, depending on how you measure it.

Peter Moretti January 24th, 2008 04:05 AM

Quote:

Originally Posted by Ian G. Thompson (Post 813188)
I don't think anyone knows the answer for certain. But David Newman of CineForm seems to think it's 8-bit and upsampled from 1440x1080 to 1920x1080 out the HDMI output "live." That last part I doubt... but I'm leaning toward this cam actually outputting 8 bits. My 2 cents.

So does the HV-20 spit out of its HDMI port the same color sampling and bit depth that the XH-G1 spits out of its HD-SDI port? Aren't both most likely 4:2:2 8-bit?

Ian G. Thompson January 24th, 2008 06:44 AM

Quote:

Originally Posted by Peter Moretti (Post 813327)
So does the HV-20 spit out of its HDMI port the same color sampling and bit depth that the XH-G1 spits out of its HD-SDI port? Aren't both most likely 4:2:2 8-bit?

As far as I know, yes... both are 4:2:2 8-bit.
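
For what it's worth, "4:2:2 8-bit" has a concrete byte cost. A quick Python sketch of the generic math (nothing Canon-specific; 30 frames per second is just an assumed round number for 60i):

Code:

import numpy as np

# Hypothetical full-resolution YCbCr frame, 8 bits per sample.
h, w = 1080, 1920
y  = np.zeros((h, w),      dtype=np.uint8)   # luma: every pixel
cb = np.zeros((h, w // 2), dtype=np.uint8)   # chroma: half horizontal res
cr = np.zeros((h, w // 2), dtype=np.uint8)   # (that's the "2" in 4:2:2)

bytes_per_frame = y.size + cb.size + cr.size
print(bytes_per_frame)                  # 4,147,200 bytes, ~4 MB per frame
print(bytes_per_frame * 8 * 30 / 1e6)   # ~995 Mbit/s at 30 fps, uncompressed

# A 10-bit version of the same frame would simply need 25% more bits (10/8).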

Yi Fong Yu January 24th, 2008 08:34 AM

http://www.usa.canon.com/consumer/co...elTechSpecsAct
The CMOS sensor on the HV20 is natively 1920x1080 (with 24p capture), but the spec page doesn't say what bit depth it is.

IMHO, it'd be a waste of money to include a 10-, 12-, or 16-bit color sensor only to capture it to tape or HDMI, because most gear/workflows are still 8-bit based.

There are only a handful of high-end pro video cards, like the latest ATI FireGL, that output 10/12+ bit color depths. And as noted, displays don't handle that.

I think it is 8-bit. If it is, there'd be no point in trying to capture 10-bit or higher color on anything.

Ray Bell January 24th, 2008 10:51 AM

Quote:

Originally Posted by Ian G. Thompson (Post 813188)
I don't think anyone knows the answer for certain. But David Newman of CineForm seems to think it's 8-bit and upsampled from 1440x1080 to 1920x1080 out the HDMI output "live." That last part I doubt... but I'm leaning toward this cam actually outputting 8 bits. My 2 cents.


Yes, it comes out of the HDMI port at 1920x1080 "live"....

Robert Ducon January 24th, 2008 02:37 PM

Quote:

Originally Posted by Yi Fong Yu (Post 813399)
There are only a handful of high-end pro video cards, like the latest ATI FireGL, that output 10/12+ bit color depths. And as noted, displays don't handle that.

The ATI X1900 XT apparently has 10-bit out via DVI, which is pretty cool for a card that's over a year old and considered a much lesser model than a FireGL-level card.

"...10 bit per channel DVI output."

Sources for the X1900 XT:
http://www.techpowerup.com/reviews/ATI/X1900XT_256M
http://www.shentech.com/3056dh.html

Yes, most displays aren't 10-bit. Shame.

I'm pretty darn sure the HV20 is 8-bit at this point.

Peter Moretti January 24th, 2008 04:44 PM

Robert,

You seem to be leading the way on the HDMI capture front. I'm working on a documentary with a lot of indoor interviews. Of course tape would be easier to use, but external capture is a possibility because the camera will be mostly stationary and indoors.

Do you find a significant improvement capturing out of the HDMI port versus using HDV tape? My subjects will not be moving much, which eliminates HDV's motion issues. I'm willing to do live capture, but I'm just not sure I'll see much of a difference.

One thing to keep in mind is that I'm hoping to do a film-out from the footage. That instinctively makes me say "capture," but with slow-moving subjects and not very challenging lighting, is it really worth it?

Thanks VERY MUCH for your input!
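
For what it's worth, the disk-space side of that decision is easy to ballpark in Python. Rough numbers only: HDV's 25 Mbit/s is fixed, the 150 Mbit/s intermediate figure is a hypothetical visually-lossless codec rate, and 995 Mbit/s is the uncompressed 4:2:2 8-bit rate sketched earlier in the thread:

Code:

# Rough ballpark of recording sizes per hour at different capture rates.
GB = 8 * 1e9  # bits per (decimal) gigabyte

rates_mbps = [
    ("HDV tape", 25),                       # fixed MPEG-2 transport stream
    ("10-bit intermediate (assumed)", 150), # hypothetical codec rate
    ("uncompressed 4:2:2 8-bit", 995),      # from the frame math above
]

for name, mbps in rates_mbps:
    print(name, round(mbps * 1e6 * 3600 / GB, 1), "GB/hour")
    # roughly 11, 68, and 448 GB/hour respectively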

Peter Moretti January 24th, 2008 06:09 PM

Quote:

Originally Posted by Ian G. Thompson (Post 813358)
As far as I know, yes... both are 4:2:2 8-bit.

Do you know if anyone has done a comparison between the two signals? With the superior lens on the G1, I'd expect a better look, but you never know.

Ian G. Thompson January 24th, 2008 06:38 PM

Quote:

Originally Posted by Ray Bell (Post 813474)
yes, it comes out of the HDMI port at 1920x1080 " Live"....

Sorry... I misspoke... let me rephrase: it "does" come out of the HDMI port at 1920x1080 while shooting live. The part that is in question is whether that image was downsampled (to 1440x1080) going through the DSP and then upsampled back to 1920x1080 out the HDMI port... all live. Personally, I don't believe there is any change in image size... but it really doesn't matter, because it produces such a great image anyway. I also believe it's 8-bit, simply because the camera does not support the new HDMI 1.3 spec, which supports 8-, 10-, 12-, and 16-bit color; the older HDMI spec was 8-bit only. But I could be wrong...
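
If anyone wants to see what that round trip would do, here's a quick Python sketch (pure illustration; nothing confirmed about what the camera's DSP actually does, and the test signal is arbitrary):

Code:

import numpy as np

# Squeeze a 1920-wide scanline to 1440 and stretch it back, the
# round trip described above (1440x1080 is what HDV records to tape).
x = np.arange(1920, dtype=np.float64)
line = np.sin(x / 3.0)                    # stand-in for fine image detail

grid_1440 = np.linspace(0, 1919, 1440)
squeezed  = np.interp(grid_1440, x, line)        # 1920 -> 1440
stretched = np.interp(x, grid_1440, squeezed)    # 1440 -> back to 1920

print(np.abs(stretched - line).max())   # nonzero: fine detail is softened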


