February 1st, 2007, 08:36 AM
A production monitor with calibration controls shows us the true image the camera records. However, most people view content on consumer-grade televisions. A consumer television cannot be calibrated to any meaningful degree and applies its own picture "enhancements." Why, then, don't we use a consumer television as a monitor, since that is what the end product will be viewed on?
February 1st, 2007, 09:04 AM
For me, it's like audio... Major recording studios will use multiple monitors of different sizes and at different distances to "hear" what their mix sounds like on different speakers. I personally try to do the same in my post-production process. I do have an older CRT pro monitor, but I also like to run my footage through some different TVs and such occasionally.
February 1st, 2007, 11:16 AM
With SD, the reason broadcast monitors were used was that basically everyone used them in acquisition and color correction. While the old "NTSC" jokes applied, there was a generally accepted range where things would look good. Because everyone at the broadcast/mastering level used roughly the same monitors (and, more importantly, vectorscopes and waveform monitors), the signal transmitted by most networks would have roughly the same color. Whatever the end user decided was "pleasing" would then be uniformly pleasing across most networks, instead of one network being heavily red, another heavily green, etc... This was the "ideal" situation - transmit the signal in whatever format (broadcast, tape, DVD) so it looks the same everywhere, and let the viewer adjust from there.
What has changed now is the amount of content created that never goes through a proper color correction. That process requires calibrated monitors AND measurement devices. Online editors and colorists rely on actual measurements to set the final coloration of their product, not just monitors. Why not consumer monitors? Because broadcast monitors can be calibrated so they look fairly similar to one another regardless of location... or at least that's the idea. More importantly, they display a wider range of colors and offer FAR more adjustability than consumer displays. If you judge your color on a consumer display with a narrow color gamut, you may think it looks fantastic until you watch it on another consumer display with a different color response. The idea is to edit and grade to the largest possible color palette and give your viewers the possibility of watching your "ideal" signal. Should they choose to warm or cool the picture on their own tube, that's fine - you gave them the choice.
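To make the "measurements, not just monitors" point concrete, here's a minimal sketch of one thing those measurement devices check: whether luma samples sit inside the broadcast-legal "studio swing" range (16-235 for 8-bit Rec. 601 video). The function name and the sample frame data are invented for illustration, not from any real tool:

```python
# Hypothetical sketch: fraction of 8-bit luma samples inside the
# broadcast-legal range (16-235, Rec. 601 studio swing).
# The sample "frame" below is invented toy data.

def broadcast_legal(luma_values, lo=16, hi=235):
    """Return the fraction of luma samples inside the legal range."""
    inside = sum(1 for y in luma_values if lo <= y <= hi)
    return inside / len(luma_values)

frame = [0, 16, 128, 235, 255]   # five sample luma values
print(broadcast_legal(frame))    # 0.6 -> 3 of 5 samples are legal
```

A real legalizer does much more (chroma limits, per-channel RGB gamut checks), but the principle is the same: the scope measures numbers the eye can't judge on any monitor, calibrated or not.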
Bottom line is: don't trust the monitor by itself. Learn to read the scopes alongside the monitor, and you'll have a far better looking product at the end.
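For anyone unsure what a waveform monitor actually plots: for each horizontal position in the image, it traces the spread of luma values found in that column. A toy sketch of that idea, using an invented 2-D list of 8-bit luma samples rather than a real video frame:

```python
# Hedged sketch of what a luma waveform monitor displays: for each
# image column, the vertical extent (min, max) of luma in that column.
# The 2x3 "frame" is invented toy data, not a real frame.

def column_waveform(frame):
    """For each column, return (min, max) luma - the vertical trace
    a waveform monitor would draw at that horizontal position."""
    cols = len(frame[0])
    result = []
    for x in range(cols):
        column = [row[x] for row in frame]
        result.append((min(column), max(column)))
    return result

frame = [
    [16, 128, 235],
    [32, 140, 200],
]
print(column_waveform(frame))   # [(16, 32), (128, 140), (200, 235)]
```

Once you can picture the trace this way, spotting crushed blacks or clipped whites on the scope becomes much easier than eyeballing them on a monitor.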
February 1st, 2007, 12:20 PM
Okay, well, if vectorscopes and waveform monitors are God, that argues for using a solution like DV Rack, right?
February 1st, 2007, 12:22 PM
For me, it's like audio... Major recording studios will use multiple monitors of different sizes and at different distances to "hear" what their mix sounds like on different speakers.
This is actually the situation I usually find myself in. Different monitors/solutions displaying slightly different versions of the same image. Drives me nuts because I never know which one to trust.
February 1st, 2007, 08:17 PM
Are you wanting a monitor for the field, or for color correction in the edit? If it's for the edit, most NLEs have vectorscopes/waveforms built in. If it's for the field, DV Rack is certainly the cheapest way to get measurements...