DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Non-Linear Editing on the PC (https://www.dvinfo.net/forum/non-linear-editing-pc/)
-   -   CRT monitor for color correction (https://www.dvinfo.net/forum/non-linear-editing-pc/470517-crt-monitor-color-correction.html)

Raymond Jebb January 5th, 2010 09:31 PM

CRT monitor for color correction
 
Hello guys, I just bought a JVC H150CG monitor to work on post-production of my short film. This is the link to the JVC page: JVC Professional Technical Description page


This monitor has BNC inputs, and the previous owner told me that he used to connect his computer to this monitor via a VGA-to-BNC cable. I have a MacBook with a Mini-DVI connection...

I have two questions:

1. Is connecting via VGA-to-BNC or DVI-to-BNC the best possible solution for color fidelity, etc.?

2. I couldn't find any Mini-DVI to BNC adapters, so I'm assuming I can connect via a Mini-DVI to DVI adapter and then attach a DVI-to-BNC cable. Would this whole setup work, and if so, would it be of acceptable quality for color grading?

Many thanks

Marty Welk January 6th, 2010 12:35 AM

Oh, I am so confused.
From what I can see, if I were thinking of getting this monitor for HD work:
You have the normal composite video and S-Video inputs.
Then, with the added option board, you have analog RGB or component.

On the computer side:
DVI outputs can be available in both analog and digital, but not all DVI outputs on computers will do analog. Does a Mini-DVI port carry the analog component, or offer adaptation to analog VGA RGB?

Which leads to this:
That isn't an HD monitor. Sure, it has great horizontal resolution, but whether the signal comes from a computer or a camera, from a board in the computer or an adapted signal, this is an SD monitor with 525 NTSC lines and 625 PAL lines.
From what I can tell (and I have stuff like this), you will not be running some magic HD RGB or HD-SDI or HD anything to it. You can still run an SD signal to it, adapt an HD signal into it, get a still-wonderful picture with it, and use it for many things.

Now, that doesn't make it useless at all, but I bet you are going to be disappointed, maybe for no real reason, that it won't be replacing some high-falutin' "pro" HD monitor costing ten times as much.

Marty Welk January 6th, 2010 12:42 AM

Now let me step out on an "oh, you shouldn't do that" limb.
What the heck do you need a professional CRT monitor for, to color grade a film/video that will never be played on a professional monitor anywhere :-) except in a broadcast station?

Secondly, why would you revert to the excellent color and contrast reproduction of a CRT for color grading when the end user is going to be watching on some rotten LCD screen thing that can't even reproduce the full colors a CRT can? Even the backlight behind the LCD doesn't have a full spectrum :-)

The projector that will show your film at various venues will be DLP, LCD, 3-chip LCD, some new laser thing, an OLED, or a plasma screen, and probably not a 3x RGB CRT projector. Yes, it is likely to be a digital projector, with all the problems that digital presents.

I am not AT ALL saying you made a huge, irreparable error and now have a completely useless professional monitor, as you might think I am saying.
But I will say that even if you use a wonderfully full-color CRT monitor with all its vast capabilities, you still need to know what the result will look like on your average LCD or digital device.

Perrone Ford January 6th, 2010 12:49 AM

I use an SD monitor for color correction all the time. It's not ideal, but it's just fine other than being small.

Computer-based video (anything coming out of the DVI port) has a different color space than what would come out via FireWire or an analog video output, so forget getting a DVI-to-analog adapter. I don't know what NLE you are using, but you have a few options for making this work for you.

Essentially you need to get a video signal at broadcast color levels out of your machine and into that monitor. You can do this in one of two ways.

1. Use FireWire. If you choose this route, you will need a FireWire connection on your computer and a device that can translate that to an analog signal. I used my DV deck for this. It works great, but it might be an expensive solution for many home users, though I imagine the prices are falling rapidly these days. Another alternative is to use a camera: you connect the camera to the computer via FireWire and the camera to the monitor via S-Video or component. The drawback, obviously, is that you will be using a camera and it's going to be powered on a lot. Darn near any camera should do the trick, so maybe something on eBay would work.

2. Use a dedicated video board. There are a number of these around, and SD-only ones are probably pretty cheap. Essentially you install it in your PC, connect it directly to your monitor, and then tell your NLE to send output through that card. A company called Blackmagic makes numerous cards for this purpose and they are excellent. You can even get combo SD/HD cards, so should you decide to move up to HD monitoring, you can keep the same card.
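
For a rough sense of what "broadcast color levels" means in practice: computer graphics are typically full-range 8-bit values (0-255), while SD video keeps luma inside the Rec. 601 studio range of 16-235. Here is a minimal Python sketch of that scaling, purely illustrative; real converters and NLEs handle chroma, rounding, and out-of-range clipping in their own ways:

    # Illustrative sketch only: map full-range 8-bit values (0-255) to the
    # Rec. 601 studio luma range (16-235) that a broadcast-legal chain expects.
    def full_to_studio(value):
        if not 0 <= value <= 255:
            raise ValueError("expected an 8-bit value in 0-255")
        return round(16 + value * (235 - 16) / 255)

    def studio_to_full(value):
        # Inverse mapping; values outside 16-235 would simply clip on many displays.
        clipped = min(max(value, 16), 235)
        return round((clipped - 16) * 255 / (235 - 16))

    for v in (0, 128, 255):
        print(v, "->", full_to_studio(v))   # 0 -> 16, 128 -> 126, 255 -> 235

That remapping (plus the RGB-to-component matrixing that goes with it) is the kind of thing the FireWire device or video board takes care of, and roughly why a plain DVI/VGA adapter isn't enough here.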

Again, there is absolutely nothing wrong with using your SD monitor for your work. I do it all the time and the colors stay true. I am then able to burn Blu-rays with confidence, and I check my Blu-rays against the numerous LCDs and plasmas at the office.

Good luck with your new monitor.

Perrone Ford January 6th, 2010 12:54 AM

Quote:

Originally Posted by Marty Welk (Post 1468537)
I am not AT ALL saying you made a huge, irreparable error and now have a completely useless professional monitor, as you might think I am saying.
But I will say that even if you use a wonderfully full-color CRT monitor with all its vast capabilities, you still need to know what the result will look like on your average LCD or digital device.

A calibrated, professional monitor is a known quantity. It allows the person creating the video to have an accurate picture of what the colors look like, and it's repeatable time and time again. Yes, the work will look different on a plasma, an LCD, an LED, or a projector. Home users do not calibrate their TVs, so the look varies widely.

But go to any TV station, film studio, or anywhere else professionals work with video, and you'll see exactly the same thing: a professional monitor, professionally calibrated, so that the person working can get repeatable and accurate results. What happens to the signal when it hits the user's home is another matter. But it is our duty as professionals to provide a broadcast-legal signal that gives the best color, contrast, and fidelity we can.

Marty Welk January 6th, 2010 12:56 AM

Sure, but when I analyze my own signal on a CRT, it is a world of difference from the LCDs out there.
I can pinch a signal into looking good on all the LCD screens and it still looks great on the CRT; reverse that, and I can make a signal that looks great on the CRT and is ruined on various LCDs.
Without seeing what I am going to look like at the end user's, I might as well put a blindfold on, probably a $5,000 one too :-)

Perrone Ford January 6th, 2010 01:13 AM

So, if you plan to deliver to 5,000 Blu-ray users, 200 DVD users, and untold numbers of folks online at YouTube, Vimeo, and other places, what do you calibrate to?

Do you calibrate it to the couple of LCDs you have? Or do you calibrate to a known standard? The same standard the local TV broadcasters are using. The same standard ESPN is using. The same standard HBO is using. If you calibrate to the same standard as everything else delivered to that TV, then your material is going to look like the rest of the material going to that TV.

Let's assume for a moment that you calibrate it to your uncalibrated LCDs around the house. And let's say that all 3 of the LCDs have the reds turned down because you don't like too much red. So you calibrate your video to that. It gets delivered to someone who has their reds set at a normal level. Now it appears that everyone has wildly red skin, but the rest of what they see on broadcast doesn't look that way.

My advice to the OP is to calibrate to the same standard as the broadcasters. If someone has their TV set up wacky, then your video is going to look just as wacky as the evening news, and in exactly the same way.

Marty Welk January 6th, 2010 01:22 AM

LCDs will white clip, black clip, and look like 16-bit color when it comes to smooth, long gradients. Not to mention the cute "auto" adjustment things, digital expansion and other digital goodies, which when ON react COMPLETELY differently to a change in the signal.
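
To make the gradient point concrete, here is a tiny Python sketch (purely illustrative, not modeled on any particular display): it quantizes a smooth 8-bit luma ramp down to fewer effective levels, the kind of thing a cheap panel or an aggressive picture mode can do. A flat bar-style patch passes through essentially untouched, while the ramp collapses into visible steps.

    # Illustrative sketch only: flat test patches cannot reveal the banding a
    # coarse display (or aggressive processing) introduces on smooth gradients.
    def quantize(value, levels):
        """Map an 8-bit value onto a display that only resolves `levels` steps."""
        step = 255 / (levels - 1)
        return round(round(value / step) * step)

    ramp = list(range(256))        # a smooth luma sweep: 256 distinct levels
    flat_patch = [191] * 256       # one flat 75% patch: a single level

    coarse_ramp = [quantize(v, 16) for v in ramp]
    coarse_patch = [quantize(v, 16) for v in flat_patch]

    print(len(set(ramp)), "->", len(set(coarse_ramp)))          # 256 -> 16: stair-stepping
    print(len(set(flat_patch)), "->", len(set(coarse_patch)))   # 1 -> 1: nothing visible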

Sure, there are broadcast specs, but the broadcasters themselves aren't jamming their signals up into white clip on an LCD, are they?

Why would I set the monitors I use to compare against what the average user will be seeing to red, or green, or anything else, unless I want to see what happens when THEY do that?

These digital contraptions don't have a dial for phase anymore; they come with many presets that users will pick from, then maybe adjust from there for their preferences.
All I have to do is have a few of them on Standard, switch them up to Vivid, try the Sports mode and the Cinema mode, and see what part of my "broadcast spec" picture no longer fits into the "digital reality of today."

If their idea of Standard is offset from where the manufacturer's standards are set, well, then they will still get the preference they selected. All I need to know is that when THEY set their LCD screens that way, MY signal still looks the way I intended.

there are "legal" broadcast signals, and there are signals that look good on many LCDs, and THEY are still legal

Perrone Ford January 6th, 2010 01:28 AM

OK Raymond,

There you go. Two varying opinions. Choose one of those, or maybe a different one altogether.

Best of luck.

Josh Bass January 6th, 2010 02:50 AM

Are you guys saying that an SD CRT is OK for HD color correction? If I ever upgrade my cam to something HD, I'll want to be able to color correct it accurately... you're saying this would all work out?

Marty Welk January 6th, 2010 03:32 AM

You certainly could see the HD color on an SD monitor :-) even if you can't see the HD resolution, provided you can get the signal TO it, and to it without it changing somewhere along the way, like in one more D/A converter or some codec that makes changes of its own trying to look good.

Josh Bass January 6th, 2010 03:43 AM

That's what I'm wondering... does doing whatever you need to do to downconvert the HD signal (let's assume a typical FireWire to D/A converter chain, a little camcorder in this case, out to the monitor) alter it?

I've worked recently on shoots where we used an old-school SD CRT monitor with an EX1, and for whatever reason the footage that looked fine on the CRT came out excessively warm in reality. This may have been due to our misunderstanding of how to calibrate to the EX1's ARIB bars at the time, though.

Resolution would be nice, but you can monitor that in your NLE. If I could get accurate color on an SD monitor and not spend $8,000 on an HD monitor that's equally accurate, that'd be sweet. This is all purely theoretical right now, of course.

Marty Welk January 6th, 2010 03:55 AM

Bars, oh my, another thing I don't understand and shouldn't even discuss around real professionals. Naughty, naughty.

Bars: a signal generator in the camera, useful for matching analog devices over long wires and such, and possibly for calibrating the monitor you're viewing on. Completely useless for analyzing things like compression tossing colors out left and right. Bars without gradient color and luma sweeps are just cartoons, which works out great if you're filming The Simpsons :-)

Internal camera signal processing: is it tied to the bars? Do the bars adjust when you adjust all that chip-to-output digital signal processing? Set up a "cine" processing look, and the bars then serve what purpose in relation to the processed signal? Do the bars and the video still match in black level when you've changed the black level in the camera to some cine version of it? Does the color match when your cine version is some greenish thing? Does the level of the colors match when the cine processing is now pulling the color down a bit (even if that is now normal)?

Digital signal: it doesn't change phase or color depth or video level, because it is not analog. Either you get a whole digital signal at the other end of a wire, or you get nothing, or nasty blocks and artifacts (nothing useful). Analog had all of that going to heck over long cable runs; it had to be adjusted to make it "right" again.

So bars, what exactly are they for now? :-) Err, setting up the monitor with a signal generator from the camera, on a monitor that is already calibrated? Ooookaaay, that shouldn't take too long.
Then you switch back to the camera, which can be set any number of ways that aren't a "standard" any longer.

For checking problems with non-digital signals over wire, like component/composite/S-Video, where, if you were having problems piping a long analog signal, you would now use digital instead? So why would you need to check a digital signal for a change? It is either there or it isn't.

Oh, I am so confused. This analog-to-digital conversion thing, it's all just in my mind, it isn't really happening like this, is it? It can't be; it must be a bad nightmare.

I don't know if you have ever seen this, but the broadcast station asks the editor to put bars at the front of their video so the station can easily adjust the color/luma/phase for the video.
So the editor puts a set of perfect bars in front of a horribly adjusted video :-) The broadcast station, which required the bars, now adjusts to the bars, and the rest of the program goes out in its horribly adjusted state :-) lol.

Of course professionals know the correct way this is all supposed to work; I am just pointing out, again, that the "machine" is only as useful as the human running it.

Josh Bass January 6th, 2010 06:12 AM

Camera bars SHOULD always be consistent, despite anything you may do to camera settings... those colors should always be at a certain luma, chroma, etc. So regardless of camera settings, if you adjust your monitor to the camera's bars, what you see should be what you get.
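
For a rough idea of what "a certain luma" means, here is an illustrative Python sketch using the textbook Rec. 601 luma weighting and nominal 75%-amplitude bar colors. The exact patterns (SMPTE, EBU, ARIB) differ in layout and precise levels, so treat the numbers as approximations rather than any camera's spec:

    # Illustrative only: nominal Rec. 601 luma of 75%-amplitude color bars.
    # Real SMPTE/EBU/ARIB patterns differ in layout and exact levels; the point
    # is that each bar color lands at a fixed, predictable luma value.
    BAR_COLORS_75 = {              # full-range R, G, B at roughly 75% amplitude
        "white":   (191, 191, 191),
        "yellow":  (191, 191, 0),
        "cyan":    (0, 191, 191),
        "green":   (0, 191, 0),
        "magenta": (191, 0, 191),
        "red":     (191, 0, 0),
        "blue":    (0, 0, 191),
    }

    def luma_601(r, g, b):
        """Rec. 601 luma weighting used for SD video."""
        return 0.299 * r + 0.587 * g + 0.114 * b

    for name, rgb in BAR_COLORS_75.items():
        print(f"{name:8s} Y = {luma_601(*rgb):6.1f}")   # descending staircase of luma

That fixed, descending luma staircase is what makes bars usable for setting up a monitor, even though they say nothing about how the camera's actual picture is being processed.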

I'm not sure about that "no phase or video level in digital" thing; that sounds wrong to me. There is no phase in HD, or at least no phase control on the HD LCD monitors I've used (and from what I understand, the ARIB bars the EX1 generates, the HD equivalent of NTSC color bars, don't have a way to judge phase adjustment).

Don't know about some of your other queries.

A D/A converter is how you get something coming out of your NLE to show up on a CRT monitor. You go FireWire out of the computer, into the D/A device, and then some kind of video cable (BNC, RCA, S-Video) out to your monitor. That gives you an "honest" feed, versus taking VGA out or some other craziness. Maybe this isn't necessary with newer video monitors... not sure.

As far as setting up bars and seeing garbage in the program itself: I used to work at a TV station, and yes, you get all kinds of stuff. Sometimes the bars actually have something to do with the levels of the program/commercial/promo/etc., and sometimes it's all way off. It depends on who put it on tape or created the digital file and whether they knew what they were doing. We would always calibrate to bars, then spot-check long-form stuff (sitcoms, dramas, whatever) for correct video and audio levels, and watch commercials and promos in their entirety when making dubs for air. Good times.

Marty Welk January 6th, 2010 06:20 AM

"Camera bars SHOULD always be consistent, despite anything you may do to camera settings. "

Yes, that would be OK. I was just referring back to the analog days when, adjusting genlocks and matching cameras over various wire lengths, the bars themselves changed along WITH the camera's signal for many things: SC phase, H phase, color loss, video loss, and a bunch of other stuff.
So with camera matching, the color bars DID follow a lot of the camera's own signal; if the color bars at the other end of the wire were all wrong, then the camera signal at that end of the wire was all wrong too.

Now, with fully digital gear, the color bars are the same after a wire transport.
But with the loads of extra internal camera processing/adjusting, the camera's signal can be all over the place, some of it a reversal of how cameras used to be matched up.

Take the example of feeding two digital SDI/HD-SDI signals into a switcher: color bars from some signal generator will match, and it won't mean anything at all :-) Split-screen them and they will both be the same, yet the cameras' pictures could still be processed completely differently.

With old analog gear and genlocking, matching the color bars was a requirement; I could split-screen the two cameras' color bars and get a good start on matching the cameras. The cameras' internal processing was not as extensive as the new digital processing is.

So with analog and all its faults, the color bars were an essential item for genlocked and wired color matching; you couldn't live without them.

