DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   The View: Video Display Hardware and Software (https://www.dvinfo.net/forum/view-video-display-hardware-software/)
-   -   Calibrating Monitors (https://www.dvinfo.net/forum/view-video-display-hardware-software/7095-calibrating-monitors.html)

David W. Jones April 25th, 2007 06:58 AM

Feed color bars into it and calibrate it as you would any broadcast monitor.

Boyd Ostroff April 25th, 2007 07:07 AM

Maybe this will help? http://www.videouniversity.com/tvbars2.htm

Will Hanlon April 26th, 2007 02:17 AM

That's helpful, thanks. Sounds like going by the PLUGE bars for brightness/contrast is right. I love how my TV lacks brightness or contrast controls other than two generic modes... I need a better TV.

Will Hanlon April 28th, 2007 09:12 PM

Alright, I'm still confused. This article says, "Turn the contrast all the way up. The white (100 unit) bar will bloom and flare. Now turn the contrast down until this white bar just begins to respond." What does that mean? Just begins to respond? This is very subjective. Is there a more concrete way of doing this? Are there any monitors that come already calibrated, or could I use some kind of waveform monitor? I'm making these tiny adjustments to my video, but they could all be for nothing if my monitor isn't set up right, which is very disturbing.

Well, I did some checking around. Looks like I could pay a professional to calibrate it or buy something like the SpyderTV to eliminate some of the guesswork. Anyone use that device? I know this is a little extreme, but the project I'm working on is very important to me, and I feel I don't have enough experience or confidence in calibrating my TV using simple DVD test patterns to know if I did it correctly... and the adjustments I'm making to the image are really fine-tuning, which is in vain if my white or black levels are off.

Glenn Chan April 29th, 2007 08:42 PM

1- When contrast is set too high, the electron beam will lose focus and/or the geometry of your image may distort.

It's easier to see if you take a small piece of paper and stick it on the monitor until you see a little sliver of light.

2- In some consumer TVs, the geometry will always change as you adjust the "contrast" (white level) setting... so it's kind of hard to tell what a decent calibration is.

3- Check that the white bar appears white to you and not grey. If it's grey, you should increase the contrast on your TV since whites should appear white. (This is a necessary compromise.)

4- Some consumer TVs can't be calibrated for hue and chroma ("saturation/color").

Your best bet for a fairly accurate image would be to get a broadcast CRT monitor ($600 up). I wouldn't bother with calibration DVDs or the SpyderTV. I wouldn't waste money trying to fix a consumer TV, because some of them have inherent problems that can't be calibrated away.

Bill Ravens April 30th, 2007 06:44 AM

I find it rather frustrating that no matter how well I calibrate my own equipment, my customer (or potential customer) always has a DVD/TV that is out of cal, so the comments about the images being "too bright" or "too dark" are all too common. When I suggest that their system is out of cal, I get distrustful glances. So, I've given up on being anal about calibrating my TV monitor. To add insult to injury, there seems to be no industry standard among DVD player manufacturers. Some add pedestal, some don't. And none of them give the user a switch to select pedestal or not. Hell, they don't even tell you in the product literature what you're dealing with.

Jay Cowley August 12th, 2007 10:35 AM

Why are the colours different on my TV vs Computer Screen
 
I edited a music video on my computer, using a Viewsonic LCD screen. I desaturated a lot of the colour to give the video a certain look, and was happy with how it looked on my computer screen.

Then I saved the file and sent it back to my HDV tape to watch on my TV. The TV made it look all colourful again and took away some of the desaturation I had applied when editing on my computer.

Is it my computer monitor that's too dull, or is it my TV that's making the image look too colourful? I also notice this when watching ATSC HDTV on my computer's tuner: a show like Lost looks more muted on my computer, whereas on my TV the colours are extremely bright and feel enhanced. I'm not sure whether a show like Lost is graded to look very bright and vibrant, or if it is meant to look more natural and a bit less contrasty.

Glenn Chan August 12th, 2007 12:11 PM

1- Some consumer TVs intentionally mess around with the image. They may have different modes... e.g. "vibrant" or "vivid".

LCDs also inherently have an s-shaped transfer function, so they will add a little contrast and saturation to your colors if that isn't calibrated out.
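A quick way to see why an s-shaped curve does this (an illustrative Python sketch only; the curve shape and the colour values below are made up, not measured from any real panel): apply the same S-curve to each channel of a slightly desaturated colour and the channels spread apart, which reads as both more contrast and more saturation.

```python
# Illustrative only: how an S-shaped per-channel curve adds contrast and saturation.
# The curve below is a made-up example, not a measurement of any real LCD panel.
import numpy as np

def s_curve(x, strength=4.0):
    """Sigmoid-style S-curve on normalized [0, 1] values, pinned at 0 and 1."""
    y = 1.0 / (1.0 + np.exp(-strength * (x - 0.5)))
    y0 = 1.0 / (1.0 + np.exp(strength * 0.5))   # curve value at x = 0
    y1 = 1.0 / (1.0 + np.exp(-strength * 0.5))  # curve value at x = 1
    return (y - y0) / (y1 - y0)

# A slightly desaturated mid-tone (normalized R'G'B').
rgb = np.array([0.55, 0.45, 0.40])
out = s_curve(rgb)

def saturation(c):
    return (c.max() - c.min()) / c.max()        # crude HSV-style saturation

print("in :", rgb.round(3), "saturation %.3f" % saturation(rgb))
print("out:", out.round(3), "saturation %.3f" % saturation(out))
# The output channels are spread further apart, so the displayed colour looks
# both more contrasty and more saturated than the source values.
```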

2- What you see on your computer screen (in your NLE) may be inaccurate in different ways.

Dave Blackhurst August 12th, 2007 02:00 PM

Jay -
One of the challenges in editing is calibrating everything... is your computer monitor adjusted for color and gamma, as well as ambient light? I've got a color adjustment on my Nvidia video cards which I really like (it seems to balance the minor inconsistencies between two "identical" monitors - yes, two of the same model are slightly "off"...), plus a Pantone Huey which adjusts (hopefully) more accurately and for room light...

I've used the THX adjustments that come on some DVDs to adjust my TVs as close as possible to their "ideal" settings - there are still subtle differences...

However, overall I feel like I'm "closer" to a "correct" image - I still have to brighten up anything from my computer screen just a touch to keep it from appearing too dark on the DVD/TV... or for that matter my printer...

EVERY device has a sort of color curve "built in" by whoever designed it - cameras, monitors, TVs, printers, etc... partly so that when you go buy a device you go "wow, don't the colors pop/look pretty..."

This doesn't mean that the colors are accurate, just that the engineers and marketers thought they LOOKED good, and they probably tested them to see what sold better... more saturated looks better than "flat". AND as you also note, TV shows and movies are color graded to achieve specific "looks" - thereby giving an atmosphere or distinctive "feel" to a show or production... "movie look" can mean a lot of things in this context - WHICH movie was that...?

The most you can hope for is to calibrate your systems as best as possible within your budget, and then adjust as you work to get your desired final result... render short portions of your project and test before you go rendering BIG projects...

HTH

Paul V Doherty August 13th, 2007 07:51 AM

In general, the more expensive and pro-grade a piece of equipment is, the fewer faux "bells and whistles" it has!

Translation: the more expensive, the more neutral the image.

Bill Edmunds November 17th, 2007 10:25 PM

How do I calibrate an HD monitor?
 
I'm using a Sharp HDTV to monitor my HD productions. How do I calibrate it? The normal rules for NTSC SD monitors don't seem to apply... or do they?

Glenn Chan November 18th, 2007 12:58 AM

1- If you are sending an analog signal to the monitor, then that might need to be calibrated.

see
http://www.videouniversity.com/tvbars2.htm

*the blue gel trick doesn't work.

Digital signals don't need calibration.

2- Other aspects of the monitor may not be ideal (e.g. white point), but the monitor may or may not have controls that would let you change that.

3- Your real problem is likely that the Sharp HDTV does weird things to the signal.

Scaling, sharpening, poor deinterlacing (e.g. bob), image "enhancements", non-standard primaries, s-shaped transfer curve (inherent to LCDs, should be calibrated away), low bit depth with poor dithering, etc.

Giroud Francois November 18th, 2007 03:01 AM

Whatever signal you send to whatever monitor you use, calibration is needed.
While it is easy with a computer (you can find USB calibration tools cheaply), it is more difficult with video. There are some tools (like special calibration DVDs) to help with this.
http://www.ramelectronics.net/html/V...libration.html

Bill Ravens November 18th, 2007 08:30 AM

Glenn...

again you're right in theory but wrong in actual practice.

ahhh, but the display most certainly does. With the plethora of brightness, contrast, color gain, bias, etc., etc., nothing comes out of the box already set up.

And many LCD screens these days, e.g. Samsung LCDs, come with auto settings and sensors that adjust for room light. Beware of these automated adjustments. You never really know what you're working with.

Bill Edmunds November 18th, 2007 08:38 AM

Quote:

Originally Posted by Bill Ravens (Post 777458)
Glenn...

again you're right in theory but wrong in actual practice.

ahhh, but the display most certainly does. With the plethora of brightness, contrast, color gain, bias, etc., etc., nothing comes out of the box already set up.

Exactly. How do I know if I have my brightness, contrast, hue, color set properly?

Giroud Francois November 18th, 2007 12:59 PM

That is the problem: you never know.
What you do know is whether all your screens give the same result (from the camera monitor, to the editing preview monitor, to your big LCD screen in the living room),
and preferably this should be close to what you see with your eyes.
With such a calibration you still do not know if you are right, but you do know that what you get at one end will stay the same at the other end.
Now, if you distribute DVDs directly to your customers, that is pretty much all you can do, since you cannot guess what settings your customers have.
If you work with professional equipment it is easier, since calibration actually means something in terms of physical settings (the vectorscope summarizes this almost in one view).
What you need to know is that consumer products (especially LCDs) tend to display very strong colors, almost always with a cast towards one of the red, green or blue primaries. (Sony screens have a really hot/bright red, while other screens lean easily towards blue.)
My advice is to set colors low, so you will not be disappointed by how nice the blue looked on your computer screen and how badly it renders on your DVD.
That is for colors.
On the luminance side, you find two schools. One favors brightness but has very poor dark rendering (most of the Batman movies are a good scale to measure by); the other favors darkness but tends to burn out bright pictures easily and gives no real black.
A simple grey scale lets you set up your own screen, but that does not help with your customers' screens, unless you include a little movie on your DVD as an extra, explaining how to set the screen properly (I have seen that on a few DVDs).
For example, in a very dark picture you can hide a message that people can read only if their screen is properly set.
http://www.drycreekphoto.com/Learn/C...itor_black.htm
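Giroud's hidden-message idea is easy to mock up. Here is a rough Python/Pillow sketch (the frame size and code values are my own illustrative picks, not from any standard, and it ignores the studio-swing vs. full-swing question discussed later in this thread): it writes text a few code values above black on a video-black background, so the message is only readable when brightness is set roughly right.

```python
# Rough sketch of a "hidden message" black-level check frame.
# Code values and frame size are illustrative picks, not from any standard.
from PIL import Image, ImageDraw

W, H = 720, 576                       # PAL frame size, purely as an example
BLACK = 16                            # studio-swing black (8-bit)
MESSAGE = 20                          # a few code values above black

img = Image.new("L", (W, H), BLACK)   # 8-bit greyscale frame, all "video black"
draw = ImageDraw.Draw(img)
draw.text((W // 4, H // 2),
          "If you can read this, your brightness is roughly right",
          fill=MESSAGE)               # barely brighter than the background

img.save("black_level_check.png")
# Correctly set display: the text is just discernible in a dark room.
# Brightness too low: the text vanishes into the background.
# Brightness too high: the whole frame looks grey instead of black.
```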

Michael Jouravlev November 18th, 2007 03:14 PM

Calibration for cheap:

* Buy Datacolor ColorVision Spyder2express colorimeter (~$70)
* Download HCFR Colorimeter program from here, it is free: http://www.homecinema-fr.com/colorimetre/index_en.php
* Get one of the test DVDs, either burn the image provided by HCFR, or buy a GetGray disk or other TV setup disks.
* Install HCFR Colorimeter onto your laptop
* Read this thread on how to use HCFR Colorimeter program: http://www.avsforum.com/avs-vb/showthread.php?t=737550
* Connect the Spyder to the computer
* Play your test disk and log monitor output with the Spyder colorimeter

You will get several graphs; it is your job to make sense of them and adjust your TV accordingly. It is likely that you will not have access to all the needed controls from a regular TV's menu; you may need to access the special service menu.

Below are a couple of charts graphed with HCFR; the TV is my 50-inch Panasonic plasma. The results are not perfect but still better than the original setup.

Gamma, before: http://www.jspcontrols.net/misc/panasonic/std_gamma.jpg
Gamma, after: http://www.jspcontrols.net/misc/pana...alib_gamma.jpg

Grayscale, before: http://www.jspcontrols.net/misc/pana..._grayscale.jpg
Grayscale, after: http://www.jspcontrols.net/misc/pana..._grayscale.jpg

Color temperature, before: http://www.jspcontrols.net/misc/panasonic/std_temp.jpg
Color temperature, after: http://www.jspcontrols.net/misc/pana...calib_temp.jpg

Primary and secondary colors: http://www.jspcontrols.net/misc/pana...ic_cie_709.jpg

I used "warm" standard preset as it was the closest to neutral. The TV does not have explicit gamma control, so I had to play with black/white levels to adjust gamma curve. I prefer my blacks to be discernable, not crushed, you can see it by the adjusted gamma curve.

As with most plasmas, the green is oversaturated and cyan and magenta are off.

Note that Rec. 709 (HD) and Rec. 601 (SD) are different, particularly in CIE colors and in gamma profile. SD gamma is 2.5, HD gamma is 2.2.

The same Spyder2express can be used to calibrate your computer monitor; luckily, if you have a reasonably new computer, monitor and OS, that process is fully automatic.
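For the curious, the grayscale/gamma graphs HCFR produces boil down to something like the following rough Python sketch: measure luminance at a series of grey stimulus levels and fit an exponent. The luminance readings below are invented for illustration; a real run would use the numbers the colorimeter reports.

```python
# Rough sketch of estimating display gamma from grayscale measurements.
# The luminance readings below are invented for illustration; in practice
# they would come from the colorimeter.
import numpy as np

stimulus = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100]) / 100.0  # % white
luminance = np.array([0.9, 3.4, 8.1, 15.0, 24.5, 36.7, 51.8,
                      70.0, 91.5, 116.0])                               # cd/m^2 (made up)

# Normalize to peak white and fit log(Y) = gamma * log(stimulus).
y = luminance / luminance[-1]
gamma = np.polyfit(np.log(stimulus), np.log(y), 1)[0]
print("estimated gamma: %.2f" % gamma)

# Per-patch gamma shows where the curve deviates (e.g. crushed or lifted blacks).
per_patch = np.log(y[:-1]) / np.log(stimulus[:-1])
for s, g in zip(stimulus[:-1], per_patch):
    print("%3.0f%% stimulus -> local gamma %.2f" % (s * 100, g))
```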

Bill Ravens November 18th, 2007 03:59 PM

As a "professional", one must adhere to the standards of the industry. To try to second guess what an average user might have his/her monitor set to is whistling in the dark. To the best of my knowledge, MOST users are still viewing on NTSC compliant monitors. If this is the case, it is necessary to make your editting monitor conform to this standard, which, I guess is SMPTE. If you're working in HD and the delivery schema is TV, then SMPTE it is. If you happen to be delivering to an HDTV, things change. HDTV isn't SMPTE. I do know that there is a different version of the SMPTE color bars that are used for HDv. The color bars defined for HDv are ITU 709 compliant. Perhaps these are the best standards to use on an monitor intended for HDTV distribution. But, I'm just guessing. I don't really know.

This is where Glenn Chan could jump in and tell us what the color mapping is for HDTV. Is it RGB? And would that be RGB 16-235 or RGB 0-255?

Glenn Chan November 18th, 2007 05:16 PM

1- Sorry, I wasn't clear in my original post. There are different aspects of the monitor that can be calibrated... the interface (e.g. composite, S-video, component, HDMI, DVI, etc. etc.) may or may not need calibration... digital interfaces don't need calibration.

Other aspects of the monitor can be calibrated.

The monitor may also have settings that let you change certain aspects of the image processing (e.g. turn off overscan to get 1:1 pixel mapping). This will depend on the menu structures in the monitor.

2- A properly-designed LCD broadcast monitor that doesn't drift will not need the user to calibrate it.

3- The real problem with most consumer monitors is that they do wacky things to the image: put them side by side with a reference broadcast monitor and they don't look the same.
They may also have limitations like limited color gamut, whatever deinterlacing circuit they have, and raised black level that you will never be able to calibrate/adjust/tweak away.

3b- It's generally easiest/best to get a broadcast monitor.
*Many old LCD broadcast monitors have some problems... they are getting better at quite a fast pace. And it looks like this might be the year where they finally surpass the Sony BVM CRTs in performance for HD.
(For SD, a CRT broadcast monitor is a much better idea IMO.)

3c- The exception to 3b might be something like the Apple Cinema Display + eCinemasys's EDP100. The EDP100 is a signal processing box that does the appropriate signal processing to get good color, deinterlacing, etc. out of the ACD (AFAIK). It has since been discontinued in favor of better approaches (they rebuild the LCD panel and integrate everything into a single device). This approach doesn't work well if you have one of the newer flawed ACDs with the pink cast problem.

3d- You might be able to take a consumer display + add something to it that would make it reasonably good (like what ecinemasys did with the EDP100). I'm not familiar enough with the current options for this (e.g. Decklink) to know how well those solutions work.

Glenn Chan November 18th, 2007 05:46 PM

Monitoring standards:

1- Viewing conditions:
For TV, there might be different standards here. (Not sure.)
SMPTE RP 166
ITU-R BT.710

In practice, this is not always followed.

2- Primary Chromaticities - just a fancy way of saying the exact shade of red, green, and blue. Chromaticity is an objective way of measuring the "shade" of a color. Primaries = red, green, blue

For SD, for NTSC countries (except Japan), the standard is the SMPTE C primaries.
For SD, for PAL countries (and Japan), the standard is the EBU primaries. *Sorry, I don't know the exact name of the standards documents.
For modern HD systems (i.e. not 1035i, not analog HD), the standard is the Rec. 709 primaries. ITU-R BT.709 should be the document.

In practice, a lot of HD material is monitored on Sony BVMs which have SMPTE C phosphors. The Sony BVMs are probably the de facto standard (and they have particular shortcomings; though less than most LCDs).

In practice, the difference between the standard primaries is often glossed over. And people don't notice (so in a practical sense this omission works; it's ok).

2b- The original NTSC primaries are obsolete. No consumer display has primaries like those.

3- There is actually no standard defining some aspects of the reference monitor. The reference monitor was always assumed to be a CRT, with the CRT's natural transfer function and motion reproduction.

There are some working groups working on a standard there.

It will likely be something that emulates the CRT (but not its flaws... e.g. not that bright, resolution not that good for high frequencies).

4- Color mapping for HDTV:
Laid out in ITU-R BT. 709. It calls for Y'CbCr (with Rec. 709 luma co-efficients).
16-235 range for Y', for 8-bit formats (black at 16 Y', white at 235 Y'). 16-240 range for chroma.
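As a worked example of that mapping, here is a minimal Python sketch encoding full-swing R'G'B' into 8-bit Y'CbCr. The luma coefficients and the 16-235 / 16-240 ranges are the ones from Rec. 709 quoted above; the helper function itself is just illustrative.

```python
# Minimal sketch of 8-bit Rec. 709 R'G'B' -> Y'CbCr encoding (studio swing).
# Luma coefficients are the Rec. 709 values; the helper is illustrative only.
def rgb_to_ycbcr_709(r, g, b):
    """r, g, b are gamma-corrected values in 0.0..1.0."""
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b          # Rec. 709 luma
    cb = (b - y) / 1.8556                               # scaled colour differences
    cr = (r - y) / 1.5748
    # Quantize: Y' -> 16..235, Cb/Cr -> 16..240 centered on 128.
    return (round(16 + 219 * y),
            round(128 + 224 * cb),
            round(128 + 224 * cr))

print(rgb_to_ycbcr_709(0.0, 0.0, 0.0))    # black -> (16, 128, 128)
print(rgb_to_ycbcr_709(1.0, 1.0, 1.0))    # white -> (235, 128, 128)
print(rgb_to_ycbcr_709(0.75, 0.75, 0.0))  # a 75% yellow bar
```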

Bill Ravens November 18th, 2007 07:17 PM

It sounds as tho' there are some fundamental problems that still need to be worked out. For example, if an NLE displays only in RGB, it's going to misrepresent HD colors in preview: if the display is thinking in Y'CbCr and the NLE is trying to display in RGB... well, you can see there's a problem, no? So the editor has to juggle the color space during editing to fool the monitor into displaying the right colors.

But, if I understand what you're saying, the best calibration one could do right now is to use a Rec. 709 color bar. Provided, of course, the end user is on an HD-capable monitor. Most of the world I know is still using SMPTE-specced equipment. Here's a comparison of the SMPTE SD color bars and the SMPTE HD color bars...
http://en.wikipedia.org/wiki/Color_bars

Care should be taken because the pluge bars are significantly different from the SD version.

Glenn Chan November 18th, 2007 08:01 PM

The conversion from Y'CbCr <--> R'G'B' is defined by the standards documents.

The editor should not have to juggle the color space to get the monitor to show the right thing, in a well-designed system. (By that definition, Vegas is not well designed, because you do have to juggle color spaces.)

2- I didn't think about this:
There are different flavours of Y'CbCr depending on which set of numbers they use (based off of the Rec. 601 or Rec. 709 luma co-efficients; there is also a third obsolete set).

Some TVs may let you choose which luma co-efficients are assumed when decoding Y'CbCr signals (because Y'CbCr signals either use Rec. 601 or Rec. 709 numbers). Using the wrong set of numbers will cause dramatic color shifts/inaccuracy and clipping of certain highly saturated colors. It is worth checking that your system gets the numbers right.

One way to check is to send color bars and eyeball them to see if they look correct... though that method is not 100% foolproof. It's possible to make a test pattern that makes it easier to see.

2b- Many consumer sets will always use the Rec. 601 numbers... which is wrong for a lot of HD.
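To get a feel for how large that error is, here is an illustrative Python sketch: encode a colour with the Rec. 709 coefficients, then decode it with the Rec. 601 coefficients, as a set that always assumes 601 would. The two coefficient pairs are the standard ones; the rest is just a demonstration.

```python
# Illustrative: encode with Rec. 709 luma coefficients, then decode with the
# Rec. 601 coefficients, as a set that always assumes 601 would do.
# Values are normalized 0..1 and unquantized to keep the arithmetic obvious.
def encode(r, g, b, kr, kb):
    y = kr * r + (1 - kr - kb) * g + kb * b
    return y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr))

def decode(y, cb, cr, kr, kb):
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

REC709 = (0.2126, 0.0722)   # (Kr, Kb)
REC601 = (0.2990, 0.1140)

red = (0.75, 0.0, 0.0)                       # a 75% red bar
ycc = encode(*red, *REC709)                  # what an HD signal would carry
print("decoded with 709:", [round(v, 3) for v in decode(*ycc, *REC709)])
print("decoded with 601:", [round(v, 3) for v in decode(*ycc, *REC601)])
# The 601 decode comes back as roughly (0.685, -0.079, 0.007): the hue shifts
# and the green channel goes negative, i.e. the colour would clip on screen.
```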

Quote:

But, if I understand what you're saying, the best calibration one could do right now is to use a Rec. 709 color bar. Provided, of course, the end user is on an HD-capable monitor. Most of the world I know is still using SMPTE-specced equipment.
I don't think there is such a thing as a Rec. 709 color bar?

2- Color bars should have the same R'G'B' code values in the end... their Y'CbCr values differ. If you originate them in R'G'B', then they should be correct as long as you do the right R'G'B'-->Y'CbCr conversion.
The Y'CbCr values differ depending on whether you used the Rec. 601 or Rec. 709 luma coefficients to encode.

Bill Ravens November 18th, 2007 08:16 PM

Here's another set of what someone is calling HD color bars, with a little more credibility than Wikipedia:

http://www.belle-nuit.com/testchart.html

Glenn Chan November 18th, 2007 11:52 PM

The problem there is that it assumes a particular mapping from R'G'B' to Y'CbCr... e.g. 16 16 16 RGB gets mapped to 16 Y' (and neutral chroma).
A- Some programs (or rather, codecs) will map 0 0 0 RGB to 16 Y'.
B- Quicktime may apply inappropriate color management onto the still.

So those things will screw everything up.
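To make the level-mapping point concrete, here is a small Python sketch of the two conventions Glenn is contrasting (the helper names are my own): "studio-range" R'G'B' already has black at 16 and is passed straight through, while "full-range" R'G'B' has black at 0 and gets rescaled into 16-235 on encode. The same grey patch ends up at different code values depending on which assumption the codec makes.

```python
# Sketch of the two RGB-level conventions (helper names are my own):
# - studio-range RGB: 16 is already black, 235 is already white, passed through
# - full-range RGB:   0 is black, 255 is white, scaled into 16..235 on encode
def studio_rgb_to_luma_code(v):
    """8-bit R'G'B' grey where 16 = black; keep the code value as-is."""
    return v

def full_rgb_to_luma_code(v):
    """8-bit R'G'B' grey where 0 = black; rescale into the 16..235 range."""
    return round(16 + 219 * (v / 255))

for grey in (0, 16, 128, 235, 255):
    print("RGB %3d -> Y' %3d (studio assumption) | Y' %3d (full-range assumption)"
          % (grey, studio_rgb_to_luma_code(grey), full_rgb_to_luma_code(grey)))
# A chart authored under one convention but encoded under the other ends up
# with lifted blacks or shifted levels, which is the mismatch described above.
```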

Giroud Francois November 19th, 2007 03:57 AM

Electronic color charts are great, but useless if you do not have a paper version to compare with... which is still the best way to quickly calibrate a screen.

Glenn Chan November 19th, 2007 10:13 PM

1- Paper charts suffer from metamerism (e.g. from illumination, and the camera's spectral sensitivities; and most cameras cheat in their color matrix, which is not correct; and then there is metamerism between different observers' eyes).

You wouldn't use them to calibrate a monitor.

2-
Quote:

These days, getting manufacturers to agree on a standard, even for something so universal as color reproduction, is a nightmare. No wonder the Roman Empire fell.
The standards are pretty good in specifying how levels should get converted... it's pretty much followed in the professional realms (software bugs and user errors aside).

On the consumer side of things... the main problem is that the wrong luma coefficients are *intentionally* being used... I believe this is a cost-saving measure in low-cost hardware implementations. The ITU-R BT.709 committee really should have listened to Charles Poynton and not have changed the luma coefficients.

2b- Manufacturers of consumer TVs don't try to make their sets color accurate / conform to the standard. So that is why we don't have similar color reproduction between sets.

3- One could argue that the bellenuit approach to their test chart and Vegas' approach to levels is not good design.

Daniel Alexander February 11th, 2008 04:48 PM

PAL Calibration
 
Hi,
OK, so I just got my JVC TM-H150C after reading great reviews for it, and I was all set to calibrate it thanks to the hundreds of how-tos on the net. HOWEVER, little did I know that all the sites I had been reading were intended for NTSC monitors and often refer to adjusting HUE, whereas in PAL land the hue control is neither accessible nor necessary.

Well, after trying to do it by playing with various controls, I have to admit defeat; nearly all attempts lead to the same result, which is my video footage looking washed out, with exaggerated banding, and way overexposed. What comforts me is that I have read on a few forums that other people with the same monitor, just after buying it, are also getting a washed-out, overexposed image.

I know I'm going on a bit, but you can imagine how important this is to me, which leads me on to some more info that may help someone help me. I've been generating my PAL SMPTE colour bars through Sony Vegas 8, and I notice that at the bottom far-right corner, where there are supposed to be three vertical bars of grey and black, only the dark grey one displays on my JVC monitor. I read in the manual that this is due to my computer missing a codec, which I don't understand at all. I also can't seem to find broadcast-standard PAL bars on the web anywhere, not SMPTE ones anyway. I'm getting quite worried now.

I also understand that colours will display differently on my computer as opposed to my monitor, but I've tried endless amounts of footage and even commercial DVDs become way overexposed on my monitor.

Any help would be welcome. Thanks

Chris Soucy February 11th, 2008 05:08 PM

Hi Daniel...........
 
I may be missing something, but wouldn't it be a lot easier to simply plug your camera straight into the monitor and fire up the colour bars on the camera?

This does, of course, require your camera to have colour bars available.

(What? No Canon XH A1/ G1?)

Then there's the other obvious question - doesn't the monitor come with any calibration system of its own?

Manual?

Web site?

Support?

Seems a strange way to sell production monitors.


CS

Daniel Alexander February 11th, 2008 05:20 PM

Thanks Chris. Getting colour bars is my secondary problem really; yes, I can get them from my camera, but my main concern is the actual calibration process, as this is very new to me. Especially since the type of bars my camera outputs (Sony EX1) is very different from the bars I have been seeing.

After extensive searching on the net, it seems to be quite a common theme: there are no clear guidelines on how to calibrate a PAL monitor, only rough, sketchy cross-conversions from how it's done on an NTSC one.

Daniel Alexander February 12th, 2008 03:18 PM

OK, so I have kind of fixed my problem. It turns out the reason I was getting overexposed, washed-out colour was that I plugged my composite feed into Video A instead of Video B (I don't know why this makes a difference, but it does). I thought I should mention it, as I have found so many people with the same problem I had, so I hope they find this.

Guy Godwin February 28th, 2008 09:38 PM

Calibrating a monitor
 
OK folks, in another thread I spoke about just buying a new Ikan V8000W LCD monitor.

Now that I have it, I need to calibrate it. I have seen the name Spyder tossed around, but I am looking for an inexpensive and simple way to do this.

Also, I am wondering: how can I calibrate a monitor if my camera's levels haven't been confirmed? Do I assume the camera defaults, with a good white balance, are nominal for all colors?

Anyway, hopefully this can be a useful discussion for all parties.

Bill Ravens February 28th, 2008 09:57 PM

I used to think a monitor cal using a Spyder or Eye1 was necessary for good color rendition of video displays. The reality is that spectrophotometers are designed to cal screens for PRINTER colors and not NTSC colors. So, a hardware spectrophotometer is not the right way to cal your monitor.

The best way is to use an ARIB multi-format color bar, available in a number of places including the internet or directly from your camera. The correct process involves adjusting black bars called pluge according to these instructions:

http://www.videouniversity.com/tvbars2.htm
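If you want to experiment with the PLUGE adjustment Bill describes without a bars generator, here is a rough Python sketch of a PLUGE-style strip: one bar a little below black, one at black, and one a little above. The 8-bit code values are approximate studio-swing picks of mine, not copied from the SMPTE or ARIB documents.

```python
# Rough sketch of a PLUGE-style strip: a bar slightly below black, one at
# black, and one slightly above black. The 8-bit code values are approximate
# studio-swing picks, not taken from the SMPTE/ARIB documents.
import numpy as np
from PIL import Image

W, H = 720, 120
BLACK = 16
strip = np.full((H, W), BLACK, dtype=np.uint8)

strip[:, 180:300] = BLACK - 8    # "blacker than black" bar
strip[:, 300:420] = BLACK        # reference black
strip[:, 420:540] = BLACK + 8    # slightly above black

Image.fromarray(strip, mode="L").save("pluge_strip.png")
# Set brightness so the below-black and black bars merge into one, while the
# above-black bar stays just barely visible -- the adjustment described in the
# linked article.
```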

Guy Godwin February 28th, 2008 11:22 PM

Bill,
Thanks for the link.
I went through it and got hung up...
I don't have the chroma control it calls for. All I have is saturation and tint to change the colors, and neither gets me to an all-B&W image.

Chris Hurd February 29th, 2008 12:35 AM

Moved from Canon XL2 to SDTV / HDTV Video Monitors.

By the way we must have a couple dozen threads on this topic -- will try to find them and merge them all together. Please search first before posting new threads. Thanks,

Guy Godwin February 29th, 2008 07:34 AM

Quote:

Originally Posted by Chris Hurd (Post 835053)
Moved from Canon XL2 to SDTV / HDTV Video Monitors.
By the way we must have a couple dozen threads on this topic -- will try to find them and merge them all together. Please search first before posting new threads. Thanks,

Chris, I actually did search for this. But I'll be honest: when I didn't find it as quickly as I wanted, I re-posted. I will dig much deeper.

Thanks

Chris Hurd February 29th, 2008 08:45 AM

Went back five years, found 24 similar threads, merged them together and "stuck" the result at the top of the forum index list. Remember when using the Search function to search the *entire* site. Hope this helps,

David Knaggs August 1st, 2008 04:10 AM

Alternatives to MXO for calibrating an Apple Cinema Display (ACD)?
 
This question is specific to the Mac platform.

I was looking at buying the Matrox MXO next week as I have a large project to color correct and all of the footage (35 hours of tape) was captured with the current versions of FCP and QuickTime (6.0.4 and 7.5 respectively). Normally, this would be a "no-brainer" as my calibration requirements are only for PAL or HD (Rec. 709) and I believe that the MXO will calibrate both on an ACD (this large project requires Rec. 709 calibration). So a "working" MXO would really be perfect for me (with the iMac I recently purchased).

But I've just checked the MXO forums today and they still haven't fixed the MXO to work with a current Mac system of FCP/QT (after nearly two months!) and so it can't be calibrated. Unless you downgrade your system (maybe). But that wouldn't work with my footage anyway (captured with the latest FCP/QT).

So I'm wondering what other options are out there for calibrating an ACD (or a Dell UltraSharp)?

Or have most people just been using the MXO?

I know a little about cineSpace (using an Eye-one2 probe and generating a 3-D LUT to load into Color).

But are there any other calibration options people have used successfully?

Thanks.

David Knaggs September 11th, 2008 02:53 PM

Just to answer my own question, Matrox finally provided a fix last Friday (5th of September):

Matrox MXO User Forum :: View topic - Fix for MXO DVI Monitor Calibration with QuickTime 7.5

So I've now got the Matrox MXO and an Apple Cinema Display. Problem solved.

Boyd Ostroff September 12th, 2008 09:06 AM

Just got an MXO myself and ran into this problem on Tuesday. As you note, there was a patch which was posted on their site, and it works.

But here's something odd. I calibrated my 23" Apple Cinema Display to my satisfaction and double checked it a few times. Looked very nice. Then while editing in FCP I had a look at the built-in color bars (on the effects tab). They seem to have a different black level than the Matrox clips. I am not in the same place as that system at the moment, but I'm pretty sure they were both set to 7.5 IRE. I used NTSC bars for standard definition in both cases.

Will have to look at this a little more carefully next week, but has anyone else noticed this?

Michael B. McGee March 25th, 2009 12:41 AM

Quote:

Originally Posted by Bill Ravens (Post 835016)
I used to think a monitor cal using a Spyder or Eye1 was necessary for good color rendition of video displays. The reality is that spectrophotometers are designed to cal screens for PRINTER colors and not NTSC colors. So, a hardware spectrophotometer is not the right way to cal your monitor.

The best way is to use an ARIB multi-format color bar, available in a number of places including the internet or directly from your camera. The correct process involves adjusting black bars called pluge according to these instructions:

- Color Bars

Does anyone disagree with Bill? I'm in the process of finding "true color" on my monitor for editing and viewing purposes on a Dell LCD monitor.

thanks,
Mike



DV Info Net -- Real Names, Real People, Real Info!
1998-2025 The Digital Video Information Network