DV Info Net

DV Info Net (http://www.dvinfo.net/forum/)
-   Canon VIXIA Series AVCHD and HDV Camcorders (http://www.dvinfo.net/forum/canon-vixia-series-avchd-hdv-camcorders/)
-   -   Single CMOS chip vs. 3-CCD... (http://www.dvinfo.net/forum/canon-vixia-series-avchd-hdv-camcorders/75456-single-cmos-chip-vs-3-ccd.html)

Peter Macletis September 13th, 2006 05:12 PM

Single CMOS chip vs. 3-CCD...
 
Single CMOS chip vs. 3-CCD

I have a question that pertains to resolution (or perceived resolution). While the HV10 features a full 1920x1080 capture area (2,073,600 pixels), with the use of a primary RGB filter in front of each pixel, we are getting only a third of that resolution per color channel (691,200 pixels), correct?

On the XL-H1, for example, there are three CCDs with a capture area of only 1440x1080 each (so, less total resolution than on the HV10), but each color channel gets its own full 1,555,200 pixels, meaning it has over twice the color detail of the HV10.

Is it correct or incorrect to assume that the color resolution on the XL-H1 is perceivably much better? Or do our eyes judge resolution by the total number of effective pixels, i.e. by the luminance part of the image?

I presuppose that the HV10, downsampling 1920x1080 to 1440x1080 for HDV recording, should have higher perceivable detail, while the XL-H1 has more color information per channel. The only comparison I know is that still pictures from my Canon EOS-1Ds Mk-II set to 8-megapixel recording (internally downsampling 16 megapixels to 8) look infinitely better than those from any native 8-megapixel camera recording at maximum quality. I mean... it is day and night in every way!!!

In photography I know how it goes when comparing single CMOS chips of different native resolutions, but in video, comparing single CMOS chips to 3-CCD blocks, I am not sure. If you could tell me how correct or incorrect I am here, I'd be most appreciative, so that at least in technology terms I understand better how these things play out in the real world and on the screen.
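[Editor's note: the per-channel arithmetic above can be sanity-checked with a quick sketch. This assumes a standard Bayer layout (50% green, 25% red, 25% blue) for the single-chip HV10; Canon has not published the actual filter pattern, and the even three-way split quoted in the post is an approximation.]

```python
# Per-channel sample counts for the two sensor designs discussed above.
# Assumption: a standard Bayer pattern on the HV10's single CMOS
# (2 green : 1 red : 1 blue), which Canon has not confirmed.

hv10_total = 1920 * 1080      # single CMOS: one filtered sample per photosite
xlh1_per_ccd = 1440 * 1080    # three CCDs: one full sample per pixel per channel

hv10_green = hv10_total // 2  # green sampled at half the photosites
hv10_red = hv10_total // 4    # red at a quarter
hv10_blue = hv10_total // 4   # blue at a quarter

print(f"HV10 total photosites:     {hv10_total:,}")      # 2,073,600
print(f"HV10 red samples:          {hv10_red:,}")        # 518,400
print(f"XL-H1 samples per channel: {xlh1_per_ccd:,}")    # 1,555,200
print(f"XL-H1 vs HV10 red ratio:   {xlh1_per_ccd / hv10_red:.1f}x")
```

Under this assumption the XL-H1 measures 3x as many red (or blue) samples as the HV10, even though the HV10 has more total photosites.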

Thanks :)

Vlad Kosulin September 13th, 2006 09:02 PM

The same DIGIC processor
 
My understanding is that for the HV10, Canon uses exactly the same DIGIC II color interpolation they use for their DSLRs.

Neither solution is ideal:

The HV10's CMOS uses color interpolation, but it contains a full 1920x1080 matrix, bringing more detail (and more contrast) to the final HDV frame. From Canon DSLR experience (all their DSLRs also use a primary RGB color filter) we also know that color interpolation with DIGIC II is not an issue at all. Also, Canon CMOS produces very low noise in low light;

Many 3CCD camcorders contain sensors with a smaller number of pixels, so picture quality suffers from upscaling to the HDV frame size (1440x1080). 3CCD should bring more accurate colors in low light, but the additional prism reduces light sensitivity.

The next step for Canon could be to use a single 6MP CMOS with pixel density comparable to their APS-C sensors. Results should be amazing, IMHO.

Vlad Kosulin September 13th, 2006 09:18 PM

Follow Up
 
After re-reading your initial post:
3CCD does not have more color detail information per color channel. It brings more _accurate_ 24-bit color information, because it can read an exact value for every pixel in every color channel.
A single CMOS reads an exact value for only one color channel per pixel and interpolates the result for the other two channels. But with the excess pixels in the HV10, this can be done very efficiently. Maybe they can even read exact values for two channels and interpolate only one. We do not know the exact sensor and filter pattern used in the HV10, so we can only assume.
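[Editor's note: to illustrate what "interpolating the other two channels" means, here is a minimal bilinear-demosaic sketch at a single photosite. It assumes an RGGB Bayer layout, which, as the post says, Canon has not confirmed for the HV10.]

```python
# Minimal bilinear demosaic at a single red photosite on an RGGB Bayer grid.
# mosaic[y][x] holds the one raw value each photosite actually measured.
mosaic = [
    [200, 120, 200, 120],   # R G R G
    [110,  60, 110,  60],   # G B G B
    [200, 120, 200, 120],   # R G R G
    [110,  60, 110,  60],   # G B G B
]

y, x = 2, 2                 # a red photosite (row 2, col 2)
red = mosaic[y][x]          # exact: measured directly at this site

# green: average the 4 edge neighbours (all green sites)
green = (mosaic[y-1][x] + mosaic[y+1][x] +
         mosaic[y][x-1] + mosaic[y][x+1]) / 4

# blue: average the 4 diagonal neighbours (all blue sites)
blue = (mosaic[y-1][x-1] + mosaic[y-1][x+1] +
        mosaic[y+1][x-1] + mosaic[y+1][x+1]) / 4

print(red, green, blue)     # only 'red' was actually measured here
```

Only one of the three output values per pixel is a real measurement; the other two are estimates from neighbors, which is why excess photosites help.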

Peter Macletis September 13th, 2006 10:03 PM

Quote:

Originally Posted by Vlad Kosulin
...Next step from Canon could be to use a single 6MP CMOS with pixel density comparable to their APS-C sensors. Results should be amazing, IMHO.

Yes, I have the same hopes. That would yield sensational results and probably be much cheaper to integrate than a 3-CCD array.

Thomas Smet September 13th, 2006 11:34 PM

With HDV, the color resolution isn't as big a deal as the luma resolution, since HDV encodes 4:2:0 color anyway. Even if you had a true 1920x1080 3CCD camera where every single pixel had RGB information, you would end up throwing out half of that chroma anyway when it gets put on tape. So in fact the chroma detail might be pretty close to what you would get on tape from a 3CCD camera. It isn't exact, because there is still some interpolation going on to fudge the pixels, but for the most part you are not throwing out much more than what HDV is already throwing out.

If you started with a 6MP CMOS chip, you would still end up throwing out chroma pixels when the image was downsampled to 1440x1080, which leaves you with 720x540 chroma pixels, or much less than what that 6MP CMOS is giving you. Yes, the quality may be a little bit better due to supersampling the image, but it wouldn't be light years better.
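[Editor's note: the chroma bookkeeping in the two posts above can be made concrete. This sketch uses the HDV 1080i frame size discussed in the thread and a round 6,000,000 pixels for the hypothetical 6MP chip.]

```python
# 4:2:0 subsampling stores chroma at half the luma resolution in both
# dimensions, so each Cb/Cr plane has 1/4 as many samples as luma.
def chroma_plane(luma_w, luma_h):
    return luma_w // 2, luma_h // 2

luma_w, luma_h = 1440, 1080               # HDV 1080i frame
cw, ch = chroma_plane(luma_w, luma_h)

print(f"luma:   {luma_w}x{luma_h} = {luma_w * luma_h:,} samples")
print(f"chroma: {cw}x{ch} = {cw * ch:,} samples per Cb/Cr plane")

# Fraction of a hypothetical 6MP chip's photosites that survive as
# chroma samples on tape:
six_mp = 6_000_000
print(f"kept as chroma: {cw * ch / six_mp:.1%}")
```

So whatever the sensor delivers, HDV's 4:2:0 recording keeps only 720x540 = 388,800 chroma samples per plane, which is the bottleneck both posts point at.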

Vlad Kosulin September 14th, 2006 06:58 AM

Quote:

Originally Posted by Thomas Smet
If you started with a 6MP CMOS chip, you would still end up throwing out chroma pixels when the image was downsampled to 1440x1080, which leaves you with 720x540 chroma pixels, or much less than what that 6MP CMOS is giving you. Yes, the quality may be a little bit better due to supersampling the image, but it wouldn't be light years better.

My idea was to handle each set of 3 neighboring pixels with RGB filters applied as a single big virtual pixel. This would approximate a single 2MP sensor with every pixel reading all 3 channels. Some normalization is still required, though.
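[Editor's note: the "virtual pixel" idea above might be sketched like this. The grouping into horizontal R/G/B triples and the lack of any weighting are illustrative assumptions, not a described Canon design.]

```python
# Treat each triple of Bayer-filtered photosites (one R, one G, one B)
# as a single "virtual pixel" that reads all three channels at once.
def virtual_pixels(row):
    """row: flat list of (channel, value) photosite readings."""
    out = []
    for i in range(0, len(row) - 2, 3):
        triple = dict(row[i:i + 3])       # e.g. {'R': .., 'G': .., 'B': ..}
        out.append((triple['R'], triple['G'], triple['B']))
    return out

# one scanline of 6 photosites -> 2 virtual RGB pixels
scanline = [('R', 200), ('G', 115), ('B', 60),
            ('R', 210), ('G', 118), ('B', 64)]
print(virtual_pixels(scanline))   # [(200, 115, 60), (210, 118, 64)]
```

Three photosites collapse into one full-color sample, which is how a 6MP mosaic would approximate a 2MP sensor with no interpolation between sites.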

HDV is slowly going away. MPEG4/H.264/AVCHD should be the next prosumer standard if NLE vendors manage to do real-time editing in commodity software.
I do not know how long MiniDV will stay mainstream, but solid-state media prices should drop significantly in the next couple of years. Removable hard drives are also an option. Both options should allow writing a full 1920x1080 frame with 4:2:2 color in the next 3 to 5 years, IMHO.

What we definitely need is better batteries, for bigger sensors and for H.264 processing.

Chris Hurd September 14th, 2006 08:20 AM

H.264 processing may not be too far away... Canon is a member of the AVCHD format group.

Thomas Smet September 14th, 2006 12:40 PM

Yes, but AVCHD is also 4:2:0. If it were a 4:2:2 profile of mpeg-4, it could no longer be called AVCHD, because that isn't part of the spec. Yes, that can change, but based on what we know right now it isn't going to.

Mikko Lopponen September 14th, 2006 01:31 PM

Quote:

Originally Posted by Vlad Kosulin
HDV is slowly going away. MPEG4/H.264/AVCHD should be the next prosumer standard if NLE vendors manage to do real-time editing in commodity software.

Not really. HDV is way better for professional work. Mpeg2 still looks very good when used with a high bitrate. AVCHD doesn't really give any advantage (at high bitrates), according to clips I've seen, and it also needs WAY more processing power just to display.

Thomas Smet September 14th, 2006 07:30 PM

Quote:

Originally Posted by Mikko Lopponen
Not really. HDV is way better for professional work. Mpeg2 still looks very good when used with a high bitrate. AVCHD doesn't really give any advantage (at high bitrates), according to clips I've seen, and it also needs WAY more processing power just to display.


I would have to agree with this to a certain degree. While in theory AVCHD can look better, I think it is going to be a long time before hardware encoders in cameras get to that level of quality. With the somewhat limited bitrates that are being used, you might get equal to or only a tiny fraction better quality than HDV. For the extra strain on the CPU, and the thousands you will have to spend on a 16-CPU system to render that stuff, I would stick with HDV for now. Even for HD-DVD/Blu-ray disks, at this point I would rather stick to using mpeg-2. While for movies VC1 or AVCHD may look a little bit better, that is mostly because they are starting from a really high quality source. In our case we are already starting from either mpeg-2 or AVCHD, so re-encoding as AVCHD isn't going to gain any extra quality, since the source is compressed already. I know I will not be able to fit as much video on Blu-ray disks by going this route, but I am not really that concerned about it.

Anyway, back to the topic... I don't see where HDV is starting to go away. Canon is coming out with three new HDV cameras in a few months, SONY is coming out with two more, and JVC is coming out with the HD200 and HD250. I would say HDV is still with us, based on the number of products coming out this fall. Besides, SONY has been very firm about AVCHD being a consumer-only format. They may use some other form of mpeg-4 for pro encoding, but as of right now there is nothing we know about in the plans. At the pro level, including XDCAM HD, it is mpeg-2 all the way for now.

Vlad Kosulin September 18th, 2006 01:53 PM

Why HDV's days will end soon
 
When I said HDV is slowly going away, I did not mean we wouldn't see any new camcorders. We still get a huge list of new DV camcorders every year, and that will not stop soon.
What I mean is that the MiniDV/HDV tape has already reached its limits. To get higher quality, a switch to a totally new medium is required (nobody is willing to build a new tape system to replace MiniDV). I believe solid-state media should become mainstream during the next 2 to 3 years. HDDs are also a viable option. Not DVD, except at the entry level, for sure.
And new media can easily accommodate the better compression MPEG4 gives us.
There is an expectation of fuel cells being production-ready in 2008 to solve the current Li-ion battery limitations.
We'll get new batteries, new chips (CMOS sensors and digital processors), new media, and new encoding.

Thomas Smet September 18th, 2006 04:52 PM

Great, you worry about what you can get 3 years from now and I will worry about what I can get today.

Ken Hodson September 18th, 2006 05:08 PM

As far as 3CCD vs. single-CCD or CMOS cams, here is a link to a Steve Mulen article that should help dispel some of the myths surrounding the actual difference between the two systems. It's a must read!

http://images.videosystems.com/files...nting%20ccd%22

As far as HDV vs. AVCHD, quality is completely bitrate related. Yes, AVCHD has the potential to be far better than HDV at high bit-rates, but this does not appear to be why it was developed. Its main design purpose seems to be encoding SMALL files at a low bit-rate. As we all know, Mpeg4 is far more efficient than HDV's Mpeg2. This is especially suitable for solid-state storage (memory sticks, small HDDs).
Low bit-rate AVCHD Mpeg4 is capable of being handled by current cutting-edge tech. But if AVCHD is to become the HDV killer at high bit-rates, that's beyond the limits of technology if it is to be made affordable. JVC, for example, had to split the 720p frame of the HD100 in half just so the processing could handle that resolution with an HDV mpeg2 encoder. Mpeg4 requires a good deal more processing than that. As of yet, the bit-rate will not allow AVCHD to become an HDV killer. HDV will be the DV of the HD world for a good 5-6 years yet.
As a side note, PCs have just now hit a level of problem-free, high-speed, high-quality, pain-free HDV editing. And this is how long after the HD-1/HD-10 were introduced? Never mind that Apple still does not have 24/25p HDV enabled in Final Cut yet.
Bottom line: AVCHD is not an HDV replacement, but a DV replacement that allows small capture file sizes for the new wave of consumer mini tapeless cams. Whether an HDV-killing high bit-rate version will come down the road will have to do with costs.

Uri Blumenthal October 9th, 2006 02:20 PM

Quote:

Originally Posted by Ken Hodson
As far as HDV vs. AVCHD, quality is completely bitrate related. Yes, AVCHD has the potential to be far better than HDV at high bit-rates, but this does not appear to be why it was developed. Its main design purpose seems to be encoding SMALL files at a low bit-rate. As we all know, Mpeg4 is far more efficient than HDV's Mpeg2.

But the inevitable conclusion from the above is that at pretty much any given bit-rate, AVCHD will outclass HDV quality-wise. Precisely because it's a better compression algorithm, it can squeeze more information into the same number of bits as HDV (just as it can squeeze the same amount of information into a smaller number of bits).

Mikko Lopponen October 11th, 2006 01:34 AM

Quote:

Originally Posted by Uri Blumenthal
But the inevitable conclusion from the above is that at pretty much any given bit-rate, AVCHD will outclass HDV quality-wise. Precisely because it's a better compression algorithm, it can squeeze more information into the same number of bits as HDV (just as it can squeeze the same amount of information into a smaller number of bits).

It doesn't work that way. MPEG2 is very good at high bitrates, and it also doesn't smooth texture as much.

AVCHD is way better at low bitrates. But at high bitrates AVCHD (and mpeg4 generally) loses its advantage, because the GOP gets smaller and the compression becomes closer to intraframe compression. There's almost nothing left to gain an advantage from.

The same goes for mpeg1. It's still very competitive at high bitrates.



DV Info Net -- Real Names, Real People, Real Info!
1998-2018 The Digital Video Information Network