DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   High Definition Video Editing Solutions (https://www.dvinfo.net/forum/high-definition-video-editing-solutions/)
-   -   Equipment and Steps for Color Grading for Film Out? (https://www.dvinfo.net/forum/high-definition-video-editing-solutions/129253-equipment-steps-color-grading-film-out.html)

Peter Moretti September 3rd, 2008 12:34 AM

Equipment and Steps for Color Grading for Film Out?
 
I am working on a documentary that I hope will get a theatrical release. To that end, I'd like to try to do as close to a final grade as I can myself. Even if I get it "wrong," I'll be a lot further along with understanding the process than if I don't try at all.

I'm cutting on a pc with Avid Xpress Pro, but will probably be upgrading to Media Composer. I have Boris Red, Boris Continuum Complete, TMPGEnc Xpress 4.0 and am looking to add Color Finesse. I also use Sony Vegas from time to time.

I want to try using Avid for the final grade. I know this is less than ideal, but the alternatives all seem very expensive, so I'm willing to try to make Avid work (I'm hoping Color Finesse will help significantly).

I understand that I need a calibrated monitor and that CRTs still work better than all but the most expensive LCDs. I've read about people using computer CRT monitors like Sony's GDM-FW900 for grading.

So here are my ?'s:

1) If I use a high quality LCD, do I attach it just like I would a normal monitor or do I use some other type of attachment? E.g. HDMI, HD-SDI.

2) How would I attach a tv monitor to my computer for grading?

4) Do I need to apply some type of conversion (a LUT?) to the image before grading for a film out?

5) What format is normally used to deliver the final project to the production facility? E.g. HDCAM SR.

I realize these are very novice ?'s but we all have to start somewhere.

THANKS SO MUCH for your help!

Bill Ravens September 3rd, 2008 06:05 AM

Quote:

Originally Posted by Peter Moretti (Post 928548)
So here are my ?'s:

1) If I use a high quality LCD, do I attach it just like I would a normal monitor or do I use some other type of attachment? E.g. HDMI, HD-SDI.



2) How would I attach a tv monitor to my computer for grading?

4) Do I need to apply some type of conversion (a LUT?) to the image before grading for a film out?


5) What format is normally used to deliver the final project to the production facility? E.g. HDCAM SR.


I realize these are very novice ?'s but we all have to start somewhere.

THANKS SO MUCH for your help!

1 - HD-SDI is probably best, feeding a production monitor (very expensive). You could probably get by with an HDTV monitor via an AJA card. The best I've seen is an Avid Mojo...also very expensive.

2 - Practically speaking, an AJA XENA or a Blackmagic Design (BMD) card.

4 - YES...but...transfer to film is somewhat of a black art when it comes to color timing. Your best approach is to rely on a color timing expert at the film house when you're ready to go to print. You'll need to be there to guide them on the colors you want.

5 - Depends on the film house. You need to identify who they are and speak with them beforehand. HDCAM SR is one choice, as is CineForm NEO HD. Again, it depends on what equipment the film house has and whether you will make a film print from digital or want to cut original film. The film house may provide you with a DNxHD 36 proxy file to edit, after which you send them the EDL for application to the HDCAM tape.

Martin Chab September 3rd, 2008 12:03 PM

Peter,
The advantage of using a LUT for color grading is that you will see your footage close to how it will look in the final print, so if you can use one, I strongly recommend doing so. The lab should provide its LUT free of charge to match your system to its printer/scanner calibration. Cinema transfer is a big world full of possibilities (linear, log, 1D LUTs, 3D LUTs, internegative, direct print to positive, diverse emulsions, etc.), so maybe the best way is to have a nice long chat with your lab.

Peter Moretti September 6th, 2008 04:27 AM

Thank you all so much for your replies.

One thing I really don't understand: wouldn't the image out of the video card's DVI port be at least as accurate as the image out of a card with HD-SDI out?

After all, DVI is uncompressed RGB, while HD-SDI is 4:2:2 YUV. I can understand using HD-SDI to feed a broadcast monitor, but a very good computer monitor (e.g., Sony's GDM-FW900 CRT or one of eCinema's current offerings) connected directly to the video card should work very well, no?

Thanks again.

Bill Ravens September 6th, 2008 07:32 AM

Speaking very generally, monitors with HD-SDI ports are production monitors, whose design is very exacting about the color produced. Monitors with DVI inputs are generally consumer computer displays, which do not meet the exacting standards of production monitors. HD-SDI is 10-bit by definition, and it is therefore supposed to reproduce billions of colors, as opposed to 8-bit, which yields millions. I've used both HD-SDI production monitors and consumer HDTVs. A good HDTV comes close, but doesn't quite reproduce images like the production HD-SDI monitor. In particular, the black level is quite striking on a production monitor.
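Bill's "millions vs. billions" figures check out with simple arithmetic: the number of distinct RGB triples is the per-channel code count cubed. A quick sketch:

```python
# Distinct RGB triples representable at 8 and 10 bits per channel.
# 2**8 = 256 codes/channel, 2**10 = 1024 codes/channel; cube for RGB.
colors_8bit = (2 ** 8) ** 3
colors_10bit = (2 ** 10) ** 3

print(colors_8bit)    # 16777216   (~16.7 million)
print(colors_10bit)   # 1073741824 (~1.07 billion)
```

The practical benefit of the extra codes is finer tonal steps, which is why 10-bit paths show less banding in smooth gradients.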

Peter Moretti September 6th, 2008 08:36 AM

Bill, we really do have to stop meeting like this ;).

I follow what you're saying, but I believe there's a difference between 8-bit and 10-bit that's being a little overlooked. HD-SDI is 10-bit 4:2:2 (YUV), while DVI is 8-bit 4:4:4 (RGB). So you're getting more finely graded color with HD-SDI... but higher chroma resolution with DVI.
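The trade-off Peter describes can be put in rough numbers. 4:4:4 carries three samples per pixel, while 4:2:2 averages two (full-resolution luma, half-resolution chroma horizontally), so the two links end up surprisingly close in raw bits per pixel. A back-of-envelope sketch:

```python
# Rough bits-per-pixel comparison of the two links in the thread.
def bits_per_pixel(bit_depth, samples_per_pixel):
    return bit_depth * samples_per_pixel

dvi_8bit_444 = bits_per_pixel(8, 3)      # 3 samples/pixel at 8 bits
hdsdi_10bit_422 = bits_per_pixel(10, 2)  # avg. 2 samples/pixel at 10 bits

print(dvi_8bit_444)     # 24 bits/pixel: more chroma samples, coarser steps
print(hdsdi_10bit_422)  # 20 bits/pixel: fewer chroma samples, finer steps
```

So neither link strictly "carries more information"; 10-bit 4:2:2 spends its bits on tonal precision, 8-bit 4:4:4 on chroma resolution, which is consistent with Peter's conclusion that the monitor itself matters more.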

Does it matter? I'm sure the monitor matters more.



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network