May 22nd, 2006, 01:59 AM
Okay, I know these are designed as televisions, but the new Bravia X series from Sony is said to have a native 1920x1080 panel with an extremely wide colour gamut, which might possibly make them good production monitors as well.
May 22nd, 2006, 01:47 PM
Wide gamut without color management means non-standard colorimetry.
If you're doing work for print or filmout *and* using a color management system (i.e. one with a 3D LUT), then a wide gamut is desirable. Otherwise, if you don't correct for it, you're definitely not getting standard colorimetry.
What I mean by that is that the colors of the primaries (red, green, blue) will differ from the engineering standards. A wide gamut implies that the red, green, and blue are more saturated than the standard calls for. For color management this is desirable, because a monitor whose gamut isn't large enough (the opposite of wide gamut) cannot reproduce the primaries you're aiming for; it won't be able to get saturated enough in at least one of them.
If they were serious about color, they'd publish the chromaticity co-ordinates of the primaries (as Sony does for the Luma series LCDs). Chromaticity co-ordinates define a color precisely... so you'd know how well the monitor could achieve what you want.
*Incidentally, the Luma series LCDs aren't great for color reproduction... Sony's own literature shows that they aren't that close to the standards (i.e. rec. 709 or SMPTE C).
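To make the "wide vs. not wide enough" point concrete: once a manufacturer publishes chromaticity co-ordinates, you can compare gamut sizes by computing the area of each RGB triangle on the CIE xy diagram. A minimal sketch (the Rec. 709 and SMPTE C primaries below are the standard values; any published wide-gamut set could be dropped in the same way):

```python
# Compare gamut sizes from published chromaticity co-ordinates (CIE x, y).
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]  # R, G, B

def gamut_area(primaries):
    """Area of the RGB triangle on the xy chromaticity diagram (shoelace formula)."""
    (xr, yr), (xg, yg), (xb, yb) = primaries
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

print(gamut_area(REC709))   # ~0.1121
print(gamut_area(SMPTE_C))  # ~0.1038 -- slightly smaller than 709
```

A "wide gamut" display is simply one whose triangle area exceeds the standard's; whether that helps or hurts depends on the color management sitting in front of it.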
2- On a practical level, the thing to do would be to sit a Bravia beside a reference-grade monitor and see if it's drastically different.
On the other hand...you could also go by Sony literature. They say that the BVM-A series CRTs should be used over the Luma LCDs for critical monitoring. And you know that the Luma series LCDs are going to be better than the consumer Bravias.
May 22nd, 2006, 02:47 PM
The Lumas look good. The trouble is that none of the monitors around at the moment can do native 1920x1080. So, as with colour, if the monitor isn't exactly 1920x1080 then the high-def image gets scaled for editing, and you're not judging a true picture. Even the top-end Sony CRTs can't display the full high-def resolution, and even the ECinema display scales the image. It's rather a ridiculous state of affairs, really.
I wonder if these new Bravia X sets will mean that Sony will update their Luma range to have true 1920x1080 resolution? Or even a professional version of the Bravias.
Are there any true high def reference monitors out there (whatever the cost)?
May 23rd, 2006, 06:22 PM
Even the ECinema display scales the image.
My understanding is that they give full HD resolution... the website shows 1900X1200 pixels.
Few cameras actually resolve the full HD resolution anyway... so a Sony CRT should be serviceable? The 32" BVM-A does do 1000 TV lines/pH, which works out to roughly 1780 lines across the full 16:9 width I believe.
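To check the arithmetic on that: TV lines are quoted per picture height, so the horizontal figure is the TVL number times the aspect ratio, which comes out nearer 1780 than 1900:

```python
# TV lines are resolvable lines per picture height (pH); to get lines across
# the full picture width, multiply by the display's aspect ratio.
tvl_per_ph = 1000
aspect = 16 / 9
horizontal_lines = tvl_per_ph * aspect
print(round(horizontal_lines))  # 1778 -- a bit short of the format's 1920 samples
```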
The ecinemasys comparison between LCD and CRT is based on a 24" CRT (which isn't the latest series/model... it's a BVM-D24E1WU).
2- Just speculating here:
Can't one of the Dell LCDs do 1920X1200 and accept component input?
Or use a Blackmagic or Ecinemasys box and a computer LCD to see full resolution. The LUT on the Blackmagic box is interesting... if you can load Steve Shaw's log curves into a CineAlta camera ( http://digitalpraxis.net/ ), you can use the Blackmagic box to get a guesstimate of what you're looking at.
*If you shoot with the log curves, accurate monitoring on set will be difficult without a lot of money.
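As a rough illustration of what such a monitoring LUT does (the curve below is a generic log encode invented for this example, not Steve Shaw's actual data): the box stores a table that undoes the camera's log transfer, so the display gets approximately scene-linear values back.

```python
import math

# Hypothetical generic log curve -- NOT the actual digitalpraxis.net data.
def log_encode(linear):                      # 0..1 linear -> 0..1 code value
    return math.log10(1 + 99 * linear) / 2   # log10(100) == 2, so 1.0 maps to 1.0

def log_decode(code):                        # exact inverse of log_encode
    return (10 ** (2 * code) - 1) / 99

# A 1D LUT, as a monitoring box might hold it: 1024 entries, code -> linear.
LUT = [log_decode(i / 1023) for i in range(1024)]

def monitor(code):
    """Look up the nearest LUT entry for an incoming code value."""
    return LUT[min(1023, round(code * 1023))]

# A mid-grey value shot through the log curve comes back out close to mid-grey.
print(monitor(log_encode(0.5)))  # ~0.4994
```

The residual error is just LUT quantization; a real box would interpolate between entries (or use a 3D LUT to handle color as well as tone).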
For color accuracy in the edit bay, get a CRT with the right phosphors/colorimetry (the phosphors define a particular colorimetry).
For the NTSC world, everyone is still using SMPTE C.
For the PAL world, I presume it's EBU.
709 phosphors/colorimetry are supposed to be the standard for rec. 709 formats (i.e. most HD formats). That's actually kind of interesting, because no one makes CRTs with 709 phosphors, and the LCDs that claim 709 colorimetry aren't that close.
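"How close is close" is easy to quantify once numbers are published: compare each measured primary against its Rec. 709 target as a distance on the xy diagram. A sketch with made-up measurements (the Rec. 709 targets are the real standard values; the measured set is hypothetical):

```python
import math

# Rec. 709 primary targets (CIE x, y) -- standard values.
REC709 = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}

# Hypothetical measured primaries for some LCD -- invented for illustration.
MEASURED = {"R": (0.655, 0.325), "G": (0.285, 0.620), "B": (0.150, 0.070)}

def delta_xy(a, b):
    """Euclidean distance between two chromaticity co-ordinates."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

for channel in "RGB":
    print(channel, round(delta_xy(REC709[channel], MEASURED[channel]), 4))
```

A monitor whose published primaries sit a large delta from the targets will render standard material with visibly shifted hues unless a LUT corrects for it.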