DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Canon EOS Full Frame for HD (https://www.dvinfo.net/forum/canon-eos-full-frame-hd/)
-   -   Why I think the 5D is using BT.601 (https://www.dvinfo.net/forum/canon-eos-full-frame-hd/144706-why-i-think-5d-using-bt-601-a.html)

Thane Brooker February 26th, 2009 08:07 PM

Why I think the 5D is using BT.601
 
The general consensus seems to be that I should use 709 when converting 5D files to RGB because the format is HD. But in this post (entry #34), Keith shows the color parameter atom in the QuickTime file as follows:

primaries = 1 (ITU-R BT.709-2)
transferFunction = 1 (ITU-R BT.709-2)
matrix = 6 (ITU-R BT.601-4)

I've just had a quick flick through the QuickTime file specification guide, and this suggests the data is encoded from RGB to YUV using BT.601.

Don't be confused by the references to 709: 'primaries' indicates the RGB colorspace is sRGB (= BT.709-2) and 'transferFunction' indicates a gamma 2.2 curve with a linear segment in the lower range (= BT.709-2).
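For anyone who wants to check their own clips, here's a rough Python sketch of reading those three fields. It's not a proper atom parser; it just scans the raw bytes for a 'colr' box of subtype 'nclc' and unpacks the three 16-bit values (the file name is only an example):

Code:

    import struct

    # Quick check of the QuickTime colour parameter atom ('colr' / 'nclc').
    # This does NOT walk the atom tree; it just scans the raw bytes, which is
    # good enough as a sanity check on a .MOV straight out of the camera.
    def find_nclc(path):
        data = open(path, "rb").read()
        pos = data.find(b"colrnclc")
        if pos == -1:
            return None
        # Three 16-bit big-endian fields follow the 'nclc' tag:
        # colour primaries, transfer function, matrix.
        return struct.unpack(">HHH", data[pos + 8:pos + 14])

    print(find_nclc("MVI_0001.MOV"))  # expect (1, 1, 6) if the file matches Keith's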

So, the question is, should I trust the QuickTime matrix value, or do I assume QuickTime is wrong and decode with BT.709 because it is an HD recording?

One way to test (other than phoning Canon) would be to shoot a color card in HD mode, then shoot under exactly the same conditions using the SD recording mode. If the theory (HD = 709, SD = 601) is true, then the camera should change colorspace to 601 when recording in SD mode, and the change would be visible when playing back the two files. If it doesn't, then the camera is using a non-conventional matrix for one of its recording modes (either 601 for HD or 709 for SD). I haven't tried this; I'm hoping somebody will know for sure so I won't need to go to this effort.

Jay Bloomfield February 26th, 2009 09:00 PM

Try this yourself: go in with a hex editor and change the matrix value from 6 to 1. I can't see any difference on any scopes when I do, but maybe you'll have better luck.

I think it really depends on what a specific decoder does with the 3 values in the nclc atom.
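If you'd rather script it than hex-edit by hand, a rough Python sketch along these lines should do the same thing (it just scans for the 'colr'/'nclc' bytes, so treat it as untested and run it on a copy of the clip):

Code:

    import struct

    # Same experiment as the hex edit, scripted: flip the 'matrix' field of the
    # 'colr'/'nclc' atom from 6 (BT.601) to 1 (BT.709).  Run it on a COPY of the
    # clip -- it rewrites bytes in place.
    def patch_matrix(path, new_matrix=1):
        with open(path, "r+b") as f:
            data = f.read()
            pos = data.find(b"colrnclc")
            if pos == -1:
                raise ValueError("no 'colr'/'nclc' atom found")
            f.seek(pos + 12)      # 'colr' + 'nclc' + primaries + transfer = 12 bytes
            f.write(struct.pack(">H", new_matrix))

    patch_matrix("MVI_0001_copy.MOV")  # file name is just an example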

Thane Brooker February 26th, 2009 09:08 PM

Quote:

Originally Posted by Jay Bloomfield (Post 1019152)
Try this yourself: go in with a hex editor and change the matrix value from 6 to 1. I can't see any difference on any scopes when I do, but maybe you'll have better luck.

I think it really depends on what a specific decoder does with the 3 values in the nclc atom.

Changing the matrix will only alter the way the decoder converts to RGB, assuming the decoder bothers to check. Premiere Pro doesn't, and others simply look at the resolution and base the decision on that. My question is, does the Canon 5D's DIGIC processor use 601 or 709 when encoding? If I knew that, I'd know what to tell my decoder to use (either by changing the matrix, using an override, or some other method).
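To put some numbers on how much the decoder's choice matters, here's a quick Python/NumPy sketch decoding one and the same Y'CbCr pixel with both matrices (full-range coefficients for simplicity; the sample value is made up, not measured from the camera):

Code:

    import numpy as np

    # Decode one and the same full-range Y'CbCr pixel with the BT.601 and the
    # BT.709 matrices.  Coefficients come from the standard published luma
    # weights (Kr/Kb = 0.299/0.114 for 601, 0.2126/0.0722 for 709); the sample
    # pixel is made up for illustration, not taken from a 5D file.
    def ycbcr_to_rgb(y, cb, cr, kr, kb):
        kg = 1.0 - kr - kb
        cb, cr = cb - 128.0, cr - 128.0
        r = y + 2.0 * (1.0 - kr) * cr
        b = y + 2.0 * (1.0 - kb) * cb
        g = y - (2.0 * kb * (1.0 - kb) / kg) * cb - (2.0 * kr * (1.0 - kr) / kg) * cr
        return np.clip([r, g, b], 0, 255).round().astype(int)

    pixel = (90, 100, 200)                        # a fairly saturated reddish sample
    print("as 601:", ycbcr_to_rgb(*pixel, kr=0.299,  kb=0.114))   # -> [191  48  40]
    print("as 709:", ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722))  # -> [203  62  38]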

Jay Bloomfield February 27th, 2009 03:21 PM

This is a little off topic, but have you played around with the most recent build of the Cineform NEO Scene demo? I have NEO HD, so I haven't tried to use NEO Scene with 5D2 MOV files. NEO Scene comes with a built-in H.264 decoder, which is licensed by Cineform from MainConcept and is a newer version than the decoders used in Premiere and Vegas 8 Pro.

Chris Barcellos February 27th, 2009 03:56 PM

2 Attachment(s)
Guys, I don't understand all of your numbers, etc., but I have been testing the 5D2 with Neo Scene and QuickTime. I noted a couple of things.

1. The Cineform and QuickTime versions are out of sync on sound.

2. On the Vegas timeline, they each preview differently, with the QuickTime version showing more shadow and highlight detail. (I must confess I updated QuickTime at some point here, but I can't recall whether it was before I transcoded or after.) I am attaching screenshots snipped from Vegas.

Jon Fairhurst February 27th, 2009 04:26 PM

Chris,

When using the Vegas scopes, make sure that you set the preview at Full/Best and 1920x1080. Otherwise it smears the histogram. With the full resolution, you will see gaps and bumps in the QT histogram.

BTW, that's definitely QT7.6. It doesn't clip the blacks and whites, and it boosts the mid tones.

Thane Brooker March 19th, 2009 04:36 AM

Has anybody else reached a conclusion on this?

I logged a support incident with Canon two weeks ago, and although they have sent a number of "we're looking into it" replies, I haven't had an answer yet.

So I set up the following experiment:

1) Calibrate an NEC Spectraview to sRGB mode, with the video card LUT set to default. This is as close as I can get to displaying video accurately on my PC.
2) Set camera on tripod.
3) Manually set White Balance on camera to 6500K.
4) Video a Gretag Macbeth color checker chart under 6500K, 98 CRI lighting, using the Standard, Neutral and Faithful Picture Style settings. Set ISO 100 and note the shutter and aperture.
5) Photograph exactly the same scene at ISO 100, with the same aperture and an equivalent shutter speed. Convert to sRGB.
6) Compare Videos to Photographs, ensuring photo viewing application is not doing any monitor profile correction.

The 601 conversion is almost identical to the photographs; with 709, the reds turn orange.

Am I the only one who thinks 5D files are encoded in 601, not 709? Has anybody else done color comparisons between video and stills?
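To illustrate the orange shift, here's a quick Python/NumPy sketch of what happens when a red that was encoded with the 601 matrix gets decoded with 709 (full range, gamma ignored; the red value is just an illustration, not taken from the chart):

Code:

    import numpy as np

    # "Reds turn orange": encode a saturated red with the BT.601 matrix (as the
    # camera appears to do), then decode it with BT.709 the way an "it's HD, so
    # use 709" pipeline would.  Full range, gamma ignored; only the matrix
    # mismatch is shown here.
    to_ycbcr_601   = np.array([[ 0.299,     0.587,     0.114   ],
                               [-0.168736, -0.331264,  0.5     ],
                               [ 0.5,      -0.418688, -0.081312]])
    from_ycbcr_709 = np.array([[ 1.0,  0.0,       1.5748 ],
                               [ 1.0, -0.18732,  -0.46812],
                               [ 1.0,  1.8556,    0.0    ]])

    red = np.array([200.0, 30.0, 30.0])
    ycc = to_ycbcr_601 @ red            # the +128 Cb/Cr offset is omitted because
    wrong = from_ycbcr_709 @ ycc        # it cancels out in the round trip
    print("original red:     ", red)               # [200.  30.  30.]
    print("decoded as BT.709:", wrong.round())     # roughly [215.  46.  28.]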

Jay Bloomfield March 19th, 2009 12:58 PM

The one variable that needs to be controlled in any experiment is which H.264 decoder is being used to display the file on the monitor. If the decoder reads the file metadata and uses it correctly, you will get one result. If it ignores it, you will get a different result. I posted elsewhere that the new version of the CoreAVC H.264 decoder (1.95) now allows you to choose either 601 or 709, or to read the metadata header information ("auto detect").

Thane Brooker March 19th, 2009 04:36 PM

Quote:

Originally Posted by Jay Bloomfield (Post 1030352)
The one variable that needs to be controlled in any experiment is which H.264 decoder is being used to display the file on the monitor. If the decoder reads the file metadata and uses it correctly, you will get one result. If it ignores it, you will get a different result. I posted elsewhere that the new version of the CoreAVC H.264 decoder (1.95) now allows you to choose either 601 or 709, or to read the metadata header information ("auto detect").

Thanks for the clarification. What I should have said is: compare the photograph to the video converted using 601, and then compare the photograph to the video converted using 709. You are right that you need to use a method where you have full control over the standard used.

For the record, I used three methods to convert from YUV to RGB:

1) FFDShow to convert to RGB, sending RGB directly to EVR. FFDShow is switchable.
2) CoreAVC, sending YUV direct to Haali. Haali is switchable.
3) AviSynth to tweak the YUV levels from 601 to 709, then brought into Premiere Pro as 601 (I documented the procedure for this in another thread; see the sketch at the end of this post for what the re-matrix amounts to numerically).

All 3 results matched as expected, so I know my method is good. My tests conclude that 601 is accurate and 709 is not. This is contrary to the general consensus that one should use 709 "because it is HD". As Canon seem unable to confirm either way, I wanted to know if anybody else agrees that 601 is the more accurate, and therefore correct, standard to use.
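For reference, the 601-to-709 re-matrix in step 3 boils down to a single 3x3 applied to the Y'CbCr values. A rough full-range Python/NumPy sketch of the idea (not the exact AviSynth internals):

Code:

    import numpy as np

    # What the step-3 re-matrix boils down to numerically: decode Y'CbCr with the
    # BT.601 matrix and re-encode with BT.709, collapsed into one 3x3.  Full range,
    # applied to (Y', Cb-128, Cr-128); a sketch of the idea only.
    from_601 = np.array([[1.0,  0.0,        1.402   ],
                         [1.0, -0.344136,  -0.714136],
                         [1.0,  1.772,      0.0     ]])
    to_709   = np.array([[ 0.2126,    0.7152,    0.0722  ],
                         [-0.114572, -0.385428,  0.5     ],
                         [ 0.5,      -0.454153, -0.045847]])

    np.set_printoptions(precision=4, suppress=True)
    print(to_709 @ from_601)   # the combined 601 -> 709 re-matrix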

Jon Fairhurst March 19th, 2009 06:14 PM

One clue is this: when you use QT to rewrap the MOV file as an MP4 and open it in Vegas, the RGB histogram is perfectly smooth - there are no gaps or bumps. That confirms that we are getting all of the levels from the camera properly. Any other 8-bit interpretation loses information.
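If you'd rather check that on a frame grab than by eye, here's a rough Python sketch (NumPy and PIL assumed; the file name is just an example) that counts empty histogram bins per channel:

Code:

    import numpy as np
    from PIL import Image

    # Rough check for the "gaps and bumps": 8-bit levels that never occur (empty
    # histogram bins) are a sign the decode path has rescaled the levels rather
    # than passed them straight through.  The frame-grab name is just an example.
    frame = np.asarray(Image.open("framegrab.png").convert("RGB"))
    for name, channel in zip("RGB", np.moveaxis(frame, -1, 0)):
        hist = np.bincount(channel.ravel(), minlength=256)
        gaps = np.count_nonzero(hist[1:255] == 0)    # ignore the very end bins
        print(f"{name}: {gaps} empty bins between levels 1 and 254")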

Given this starting point, we can then re-color to taste. If you use 32-bit processing in your project, you will retain the quality.

IMHO, 709 vs. 601 isn't such a big deal on the capture side. Just make the decision to get all the data you can and grade to taste while matching scenes. It's more art than science.

On delivery, it's important to get the color space correct. Aside from monitor variations, that's what your audience will see. On the capture side? Don't sweat it - just make sure that you get all the data that the camera can deliver.

Thane Brooker March 19th, 2009 07:21 PM

Jon, I agree that for most projects 601/709 on the capture side isn't important. As you say, as long as all the 'bits' are properly in the NLE, one can do what is necessary to make the footage look good. From an artistic point of view, whether you start with "technically correct" colours or "technically incorrect" colours, if you're going to re-grade by eye in 32-bit/10-bit space it doesn't really matter: just tweak the colours and don't worry about the actual numbers. But for some workflows where colour accuracy is more important than looks, starting with technically correct colours is easier and requires less correction in post.

The actual reason I raised this point is not because I'm having problems editing or creating output, but because I'm documenting a workflow. I need to be technically accurate, so if the DIGIC 4 processor in the 5D is encoding RGB data to YUV using the 601 standard, my workflow needs to state that the .MOV files should be decoded back to RGB using the same standard.

I guess most people would prefer to set up their H.264 decoder correctly, even if they were going to regrade in post. And I know for some personalities, doing a transcode or initial playback with the wrong setting ticked would be sacrilege!

Jon Fairhurst March 20th, 2009 12:31 AM

I can relate, Thane. Sometimes getting it technically correct is paramount. As a matter of fact, I produced a ten-minute video loop on Blu-ray that is the IEC international standard for measuring television power consumption. Another member of the project team had used SD versions of the content and encoded a DVD with a cheap encoder. The standards committee voted to approve that rough version. I then acquired the HD footage and the EDL and had to reverse-engineer the thing. I had content of all stripes (1080i, 720p, 480p, 480i, 50 Hz, 60 Hz...), and the differences between the HD versions and the SD source were staggering. And it all had to meet our target APL (average picture level) histogram.

During the development, I created 601 to 709 and 709 to 601 matrices for Vegas. Truth be told, the problems were never that simple. On a number of clips, I had to hand roll conversions that were close, but never perfect. It was within the margin of error though, and approved. The standard was published in October last year, Energy Star started using it on November 1st, and the EU and Australia are in the process of adopting it. So far, there have been no technical complaints. (Yay!)

IEC Webstore | Publication detail > IEC 62087-BD Ed. 2.0 English (No, I don't profit from it...)

So yeah, for creative work, just make sure you get all the bits out of the camera, work in 10 bits per color or better, and grade to taste. You'll have fewer gray hairs that way. But sometimes, you've just got to get it technically perfect.

Speaking of which... did you know that when you encode simple test patterns, such as colorbars, with MPEG-2, you get offsets of one or two bit levels? Different encoders and decoders give different errors. I've been told by experts that one of the things they fixed when developing MPEG-4 is that it's bit accurate for flat levels - or at least repeatable from decoder to decoder.

Thane Brooker March 20th, 2009 12:56 AM

Quote:

Originally Posted by Jon Fairhurst (Post 1030603)
I can relate, Thane. Sometimes getting it technically correct is paramount.

Of course I knew you'd relate; I've seen the (in)famous "5D shutter exposed" piece you and Nathan did. Now, in my opinion, that is a work of art!

Quote:

Originally Posted by Jon Fairhurst (Post 1030603)
Speaking of which... did you know that when you encode simple test patterns, such as colorbars, with MPEG-2, you get offsets of one or two bit levels? Different encoders and decoders give different errors.

*Sigh* And stuff like that certainly gives me gray hairs. QuickTime displays a different default gamma from everything else. I wanted to work out exactly what the difference was, but two different effects in Premiere Pro gave ever so slightly different results, even though I was typing in exactly the same gamma difference. Then you've got the Haali Video Renderer giving a slightly different color cast from EVR. And then there are the different ways RAW converters may or may not use absolute, relative, perceptual or some other proprietary color shift to avoid clipping saturated colors in the downsample to sRGB, and CoreAVC and FFDShow outputting different frame sizes for no apparent reason. And... and... and...

So all these seemingly 'impossible' variances, which shouldn't be there as we're dealing with standards, formulas and simple maths (it's not like I'm dealing with analogue signals here), make it all the more important for me to be technically consistent and accurate in my workflow.

Mark Hahn March 20th, 2009 01:50 PM

Quote:

Originally Posted by Thane Brooker (Post 1030608)
Of course I knew you'd relate; I've seen the (in)famous "5D shutter exposed" piece you and Nathan did. Now, in my opinion, that is a work of art!



*Sigh* And stuff like that certainly gives me gray hairs. QuickTime displays a different default gamma from everything else. I wanted to work out exactly what the difference was, but two different effects in Premiere Pro gave ever so slightly different results, even though I was typing in exactly the same gamma difference. Then you've got the Haali Video Renderer giving a slightly different color cast from EVR. And then there are the different ways RAW converters may or may not use absolute, relative, perceptual or some other proprietary color shift to avoid clipping saturated colors in the downsample to sRGB, and CoreAVC and FFDShow outputting different frame sizes for no apparent reason. And... and... and...

So all these seemingly 'impossible' variances, which shouldn't be there as we're dealing with standards, formulas and simple maths (it's not like I'm dealing with analogue signals here), make it all the more important for me to be technically consistent and accurate in my workflow.

Most standards legally allow variations. As an extreme example, you can compress MPEG video with a zillion options, and the players just have to support all of them.

I come from the analog video days. As they said in Ghostbusters "It's more of a guideline than a rule" when they crossed their streams.



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network