DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Final Cut Pro X (https://www.dvinfo.net/forum/final-cut-pro-x/)
-   -   Which screen brightness should I use? (https://www.dvinfo.net/forum/final-cut-pro-x/534646-screen-brightness-should-i-use.html)

Urban Skargren September 8th, 2017 07:00 PM

Which screen brightness should I use?
 
I have always wondered, when editing, which level of screen brightness is adequate? Not for having a comfortable level for my eyes, but how much should it be so that when I judge brightness on the actual clips I judge correctly?

(in this case on a Macbook Pro 2017)

Edward Carlson September 8th, 2017 07:54 PM

Re: Which screen brightness should I use?
 
Unless you've calibrated your monitor, it doesn't really matter. You can calibrate it with SMPTE bars, but computer monitors represent and adjust colors differently than TVs do.

Boyd Ostroff September 9th, 2017 09:25 AM

Re: Which screen brightness should I use?
 
I think newer MacBook Pros can be connected directly to a calibrated external video monitor. This is not the same as using an external monitor as an extension of the Mac desktop, however. You would access this in FCPX using the Window > A/V Output menu. See this: https://support.apple.com/kb/PH12783?locale=en_US

Older machines, like the 2012 quad core Mini Server I use, don't natively support this feature. So I use a little Blackmagic UltraStudio Mini Monitor interface that connects to the computer with Thunderbolt.

William Hohauser September 9th, 2017 04:20 PM

Re: Which screen brightness should I use?
 
Apple includes 709a, 2020 and P3 color calibrations with their OS now. Choose one of those and then use a grey scale chart with either a grey ramp or enough iterations of grey to see if you are losing dark greys or near whites. This will be adequate for many editing jobs but not totally reliable for high level color correction.
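As an aside on the grey-scale-chart idea: rather than downloading one, a chart like this is easy to generate yourself. Below is a minimal sketch (not from this thread; the function name and step count are my own choices) that writes an 18-step grey ramp as a plain ASCII PGM file, which Preview and most image viewers can open. View it full-screen and check that the darkest and brightest steps are still distinguishable from their neighbours.

```python
# Sketch: generate a horizontal grey step chart as an ASCII PGM file.
# Open it full-screen on the display under test; if the darkest steps
# merge into black or the brightest into white, adjust brightness.

def write_grey_chart(path, steps=18, width=1260, height=200):
    """Write a grey step chart (levels 0..255, left = black) as ASCII PGM."""
    band = width // steps  # pixels per grey band
    levels = [round(i * 255 / (steps - 1)) for i in range(steps)]
    row = [levels[min(x // band, steps - 1)] for x in range(width)]
    with open(path, "w") as f:
        f.write(f"P2\n{width} {height}\n255\n")  # PGM header
        line = " ".join(str(v) for v in row)
        for _ in range(height):
            f.write(line + "\n")
    return levels

levels = write_grey_chart("grey_chart.pgm")
```

This only checks the brightness/contrast endpoints, of course; it says nothing about colour accuracy.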

David Knaggs September 9th, 2017 09:35 PM

Re: Which screen brightness should I use?
 
I'm glad that you chimed in on this thread, William. I've spent the past few weeks re-examining what I know about color calibration using the screens on my iMacs and secondary displays (an Apple Cinema Display and a Dell UltraSharp), because I suspect my thinking is still mired in the old CRT (cathode ray tube) days, when CRT tubes were all anybody was watching, even on computers, and I must have quite a few gaps in my understanding in 2017. I really value your opinion and expertise in these matters.

I've been pondering:
a) How close can I get with the screens on my Mac (and secondary displays)?
b) Or should I just bite the bullet and go down the route of getting a modern pro broadcast monitor (maybe something like a Blackmagic SmartView), plus, I guess, a device which can take the signal out of the Thunderbolt port and convert it into an SDI feed that the pro monitor can accept as input (maybe a device like the one Boyd was talking about)?

It seems to me that, in my line of work, I'm likely to be dealing with 4 different types of color spaces these days:

1. CRT color spaces, such as PAL and NTSC which require those bars for calibration plus the blue-gun or blue-only modes. About a decade ago, I was seduced by the Matrox MXO promo which promised to calibrate my Mac secondary display into an equivalent of a broadcast monitor. This sounded like my dream come true! It gave you a blue-gun feed and everything. I wasted about 12 months with their service reps trying to get it to work. It was always way off. Yet they would never admit that their product didn't really work as advertised.

But in 2017 I'm wondering how relevant PAL and NTSC bars and calibration are. Who still uses a CRT screen? It's all HD and UHD TVs, computer screens and iPads today. I would imagine that the PAL and NTSC colour spaces are redundant now. (I'm not talking about the frame rates here, just the color spaces.) But I noticed that the modern SmartView pro monitor has a "Blue only" feature. So maybe I'm wrong about those color spaces being redundant? Or were they just making it backwards-compatible?

2. HD TV colour space. This is Rec. 709. Some years ago, I ponied up for a Spyder 3 Elite, because the Elite version promised calibration for Rec. 709. Although their addition of Rec. 709 seemed like an afterthought to me. I feel that those things are 99% designed for stills photographers so that they can make a photo which can be accurately printed on paper, rather than for a video screen. Just an opinion. I could be wrong. But I never trusted the results of the calibrations with this device. I'd calibrate the iMac with their version of Rec. 709 and then do the same thing with the secondary display. And the images both screens produced looked markedly different. Perhaps this was due to operator error? Although I noticed that the recent promo for the Spyder 5 sensor bragged that it gave a "55% improvement on the Spyder4 system". And I was using the even-earlier Spyder 3. So perhaps that was the problem? But the other thing that bothered me about this method was what if you or someone else accidentally altered the brightness control on the keyboard or on the side of the secondary display (when you went to turn it on or off)? You wouldn't be accurate again until the time of your next scheduled calibration.

3. UHD (4K) color space. This is Rec. 2020.

4. Computer Display color space. This is Adobe RGB, I believe. I imagine this color space is also used by iPads, iPhones and their Android equivalents, but I'm guessing about this last bit.
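On item 2, the Rec. 709 curve a calibrator is trying to hit is published in ITU-R BT.709 itself. As a minimal sketch (my own illustration, not something from this thread), the standard opto-electronic transfer function maps normalised scene light to a normalised video signal:

```python
def rec709_oetf(light):
    """ITU-R BT.709 opto-electronic transfer function.

    Maps normalised scene light (0..1) to a normalised video signal (0..1):
    a linear segment near black, a ~0.45 power curve above it.
    """
    if light < 0.018:
        return 4.5 * light
    return 1.099 * light ** 0.45 - 0.099

# Near-black values get a large linear boost (4.5x), which is one reason
# crushed dark greys on a mis-set display are so visible on a grey ramp.
signal = rec709_oetf(0.18)  # mid-grey scene light
```

The curve is steepest near black, so small display-brightness errors show up first in the dark end of a grey chart.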

I'd noticed that the latest Mac OS gives an expanded array of color spaces. I'd been pondering whether to use "HD 709-A" or "Rec. ITU-R BT.709-5" (which seems a bit darker than 709-A). Your answer has solved that. Thanks!

I also like that you mentioned the use of a grey scale chart. I think that would solve my earlier question of "What happens if you accidentally alter the brightness control on the computer or secondary screen?"

With your method I imagine that, if the grey scale chart showed a problem with the dark greys or near whites, you would simply adjust the display brightness key (or button) up or down.

Is there a particular grey scale chart that you would recommend for putting up on the screen for this calibration?

Sorry for the long post but I've been intensely researching this subject online for the past few weeks to overhaul and update my knowledge about this area and I've been extremely disappointed, to say the least, with the information that I've found out there.

Perhaps we should call this thread "Display calibration in FCP X for varying budgets"?

One pleasant surprise of this recent review has been how much the price of pro broadcast monitors has come down in the past decade. I think a CRT broadcast monitor a decade ago cost $8,000 or $10,000 (in Aussie dollars) which is why, at the time, I wasted about $1,400 on a Matrox MXO box which promised the same result. Today that same money would get me a SmartView broadcast monitor (which are in the $1,200-$1,500 range). Although I don't know how much the SDI converter would cost.

William Hohauser September 9th, 2017 11:05 PM

Re: Which screen brightness should I use?
 
Apple has apparently put some thought into this and the high end laptops and iMacs have high quality displays that are versatile enough to cover much of the color ranges needed for video production. How close to a broadcast monitor? I don't know.

The Spyder calibrators don't have a great reputation, and in my experience they are OK on certain monitors for getting close to Rec. 709a. They work on my NEC monitors, but once presented with 2020 RED footage I found the Spyder calibrations too dark in the low grays. The Spyder calibrator is useless with my 17" MacBook Pro and the Samsung monitor connected to it. Those I calibrated by eye. I am not sure how HDMI is handled directly out of a computer. My Samsung monitor has fewer color selections in System Preferences when connected by HDMI than by DVI.

There are computer monitors that claim video production quality that are reasonably priced. By reasonably priced I mean around $1000 instead of a much more expensive broadcast monitor with SDI inputs. Here is one I am interested in: https://www.benq.com/en/monitor/vide...ion/pv270.html
The supposed best way to gauge color is with an SDI output converter from the computer that interfaces with your NLE, feeding a good SDI monitor. Rather expensive still. If fortunes look up this year I just might spring for the BenQ monitor.

Look up "gray scale chart" on Google; you'll find some decent ones posted by helpful professionals.

David Knaggs September 10th, 2017 12:35 AM

Re: Which screen brightness should I use?
 
Thanks, William!

I agree that Apple are putting quite an effort into this. I currently edit with a pair of late-2013 iMacs (in two different editing locations) so hopefully these are recent enough to get a good result with these recently-expanded color space choices.

You know, I was looking at that very same BenQ monitor last week. A local company called Image Science sells them for A$1,299 which looks like great bang for the buck! It's interesting that, like you, they don't recommend Spyder 1, 2, 3, 4 or 5. (They have a specific list of "Not Recommended Calibrators".) Although they highly endorse the i1Display Pro. I might drop into their premises next week and have a chat. I wasn't looking to get a monitor so soon after springing for my new 4K camera (and accessories) but it's certainly moving towards the top of the list.

So it currently looks to me that if you want reasonably accurate color correction in FCP X, a range of calibration solutions for varying budgets might be:

1. Your solution of selecting the HD 709-A color space and then using a greyscale chart to adjust brightness as necessary. (Costs nothing.)

2. Using the i1Display Pro to (I hope) calibrate to a Rec. 709 color space. (Cost A$349. Or costs US$259 at B&H.)

3. Getting a BenQ PV270 or similar monitor. (Cost A$1,300. Or costs US$900 at B&H.)

4. Getting a pro broadcast monitor plus a converter for the monitor input. (Cost unknown to me at this point.)

Boyd Ostroff September 10th, 2017 09:48 AM

Re: Which screen brightness should I use?
 
I wrestled with a lot of this a year ago when I upgraded my system. Funny - I also got a Matrox MXO shortly after it was released and used it with an Apple Cinema Display as recommended. I had good results with it, although I didn't have any other broadcast monitors to compare it with. The MXO was discussed back then (can't remember.... maybe 2008?) and I spoke with people who said they were using them professionally and they looked almost identical to very expensive broadcast monitors.

I know there were several versions of the Apple Cinema Display, and there was some discussion at the time which was best. I'm still using the one I got more than 10 years ago (with a DVI to HDMI adapter) on my quad core mini and it looks surprisingly good. It was very expensive back then, but turned out to be a good investment.

Last year I replaced several 10 year old consumer level HD monitors and was surprised at how inexpensive they've gotten. One of them is a Vizio smart TV from Best Buy that cost less than $150. The interesting thing about it is the advanced color configuration settings in the menu; you can even do blue only, and this is in the normal menu system, not some secret service menu.

Anyway, I ended up getting the Blackmagic Mini Monitor - it has both HDMI and SDI and only cost about $140 from B&H last year. I also got a Sony LMD-2110W production monitor and have been really happy with it. I had hoped for a 24" screen, but that was a big jump in price, so I settled for 21". IIRC it cost about $800 at B&H, which seemed very reasonable. Couldn't find any real reviews of it, but it had good comments on various forums.

One issue you also have to consider is audio. That is just an afterthought on this monitor; it only has a mono input and a terrible little speaker with no audio pass-through. Now if you take the audio directly from the Mac, of course it will be way out of sync. Legacy FCP has an offset setting for this; I have not checked the menus in FCPX. I tried this and found it was pretty dicey, so I didn't trust it. I ended up getting a little box from B&H called a Comprehensive 4KX2K HDMI Audio Extractor. The HDMI from the computer plugs into this and passes through to the monitor, and it has two RCA plugs and a TOSLink audio output. I use the RCA plugs to connect to my powered studio monitors. The specs show latency figures in the nanoseconds for this device, and it works well.

So all combined, I spent just a bit over $1000 for all this stuff, which looked like about the best I could do on a limited budget. But in pondering everything, I pretty much came to the conclusion that calibrated monitors aren't terribly relevant to my work, which is for the web or digital projection where I can personally tweak the color. People are going to be viewing it on so many different devices and screens, it's hard to imagine what to expect. I look at my videos on a bunch of screens at home before finishing - MacBook Air, PC with VGA monitor, 24" Samsung monitor on my media server, 46" Sony LCD, my old Apple Cinema Display and my iPhone 6s+.

Spent some time comparing the Sony LMD-2110W side by side with other screens, and found it was virtually impossible to make them match across a broad range of content. Maybe William can help with this, but I think the LMD-2110W has a more linear response than typical consumer screens?

This screen has an optional SDI module that I've considered getting (fairly expensive, maybe $300?) but have been reluctant since I don't know if the Blackmagic box will still provide HDMI audio while using the SDI port. The monitor does have audio pass-through if you use SDI, but it's only mono, so that's no good.

One other thing, if you use HDMI input on the LMD-2110W, then you don't have access to the full color calibration settings. I don't understand this, but apparently it was intentional. Is there some reason why this couldn't work properly over HDMI, or was it done to keep the price down? If I connect via component from my XDCAM-EX or Sony tape deck, then I have full color calibration menus. I believe this is also the case with SDI, so that's one reason it might be worth getting the SDI module.

William Hohauser September 10th, 2017 08:34 PM

Re: Which screen brightness should I use?
 
SDI is the most reliable way to be assured that a video standard signal is getting to your monitor. A dedicated box like the Blackmagic is going to reduce the possibility of selecting the wrong color space as well.

And no, there is no way to completely balance different brand screens to each other unless they use the same flat screen component and have similar electronics powering the screen. Consumer flat screens are tuned for receiving broadcast and equalize the signals to make the viewing experience easier for the consumer. Manufacturers feel that consumers like contrast, so the dark grays tend to black out. Vizio monitors (I have one at home) overcompensate irritatingly for image variation; I turned all the automatic functions off so the image wouldn't continuously adjust itself. The Sony monitor is a pro monitor and yes, it does have a more linear response. You are not going to get a $150 monitor to respond like it; maybe close at best.

David Knaggs October 23rd, 2017 04:47 PM

Re: Which screen brightness should I use?
 
Apologies for the lateness of my reply. I was caught up on a couple of large projects and am only just now properly catching up on things.

1. William's solution (Rec 709A and the grey scale chart) was a really good one, in my opinion. On both of the iMacs I use, I ended up setting the screen brightness to 10 out of 16 after using the grey scale. And both the Cinema Display and Ultrasharp matched up pretty well with their respective iMacs using William's method.

2. Boyd, I remember getting excited reading similar things online, before I bought the MXO, about people saying how close it looked to a broadcast monitor, etc. The clincher for me was one article where a guy said he filmed something in his garden and was now looking at the footage on his MXO set-up display while comparing it to the view through his window of the garden, and he said he couldn't tell the difference! After reading that, I think I rushed out and bought the MXO the next day.

But the reality of the MXO was blown-out highlights on the actress's face. And this was footage shot by an excellent DP, who would always nail perfect exposure. A few weeks later that same footage was shown at a JVC event (I had one of the first GY-HD100 series cameras in the country) and they showed it on their latest projector, which was calibrated by the JVC engineers, and that footage looked perfect.

I would get up in the middle of the night and call Matrox in Canada and let their engineer remotely take over my computer (on a number of occasions) in a bid to fix this, all to no avail. Then he eventually gave up and handed me over to a guy in England, who wanted to do exactly the same things as the guy in Canada. It all fizzled out from there, but at no stage did they think it was a hardware problem or offer a replacement box.

Ironically, at one point when the Canadian engineer had taken over my computer, the blown-out highlights disappeared from my Cinema Display and I said to him, "That's it. You've fixed it! I'll be happy working with that from now on as a calibrated screen!" And he said, "Dave, I disconnected the MXO feed. You're looking at the regular computer feed into your Cinema Display."

So I guess my point is, unless something looked glaringly bad as in my case, how would you know if your MXO solution was really calibrated or not?

Anyway, looking forward, I think William's is an excellent immediate solution, and in the New Year I'll look at upgrading my set-up with something like an i1Display Pro and a BenQ PV270.



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network