View Full Version : CRT monitor for color correction


Raymond Jebb
January 5th, 2010, 09:31 PM
Hello guys, I just bought a JVC H150CG monitor to work on post-production of my short film. Here is the link to the JVC page: JVC Professional Technical Description page (http://pro.jvc.com/prof/attributes/tech_desc.jsp?model_id=MDL101461&feature_id=02)


This monitor has BNC inputs, and the previous owner told me that he used to connect his computer to it via VGA-to-BNC. I have a MacBook with a Mini-DVI connection...

I have two questions:

1. Is connecting via VGA-to-BNC or DVI-to-BNC the best possible solution for color fidelity, etc.?

2. I couldn't find any Mini-DVI-to-BNC adapters, so I'm assuming I can connect via Mini-DVI to DVI and then attach a DVI-to-BNC cable. Would this whole setup work, and if so, would the quality be acceptable for color grading?

Many thanks

Marty Welk
January 6th, 2010, 12:35 AM
Ohh, I am so confused.
From what I can see, if I were thinking of getting this monitor for HD work:
you have the normal composite video and S-Video inputs;
then, with the added input board, you have analog RGB or component.

On the computer side:
DVI outputs can be both analog and digital, but not ALL DVI outputs on computers will do analog. Does a Mini-DVI have the analog pins, or the adaptation TO analog VGA RGB?

Which leads to this:
that isn't an HD monitor. Sure, it has great horizontal resolution, but whether it's fed from a computer, a camera, a board in the computer, or an adapted signal, this is an SD monitor with 525 NTSC lines and 625 PAL lines.
From what I can tell (and I have gear like this), you will not be running some magic HD RGB or HD-SDI or HD anything to it. Although you can run an SD signal to it, adapt an HD signal into it, get a still-wonderful picture with it, and still use it for many things.

Now, that doesn't make it useless at all, but I bet you are going to be disappointed - maybe for no real reason - that it won't be replacing some highfalutin "PRO" HD monitor costing ten times as much.

Marty Welk
January 6th, 2010, 12:42 AM
Now let me step out on an "oh, you shouldn't do that" limb.
What the heck do you need a professional CRT monitor for, to color grade a film/video that will not be played on a professional monitor anywhere :-) except in a broadcast station?

Secondly, why would you revert to the excellent color and contrast reproduction of a CRT for color grading WHEN the end user is going to be watching on some rotten LCD screen thing that can't even reproduce the full colors a CRT can - even the light behind the LCD doesn't have a full spectrum :-)

The projector that will show your film at various venues will be DLP, LCD, 3x LCD, some new laser thing, an OLED, or a plasma screen - probably NOT a 3x RGB CRT projector. Yes, it is likely to be a digital projector, with all the problems digital presents.

I am not AT ALL saying you made a huge irreparable error and now have a completely useless professional monitor, as you might take me to be saying.
But I will say that even if you use a wonderfully full-color CRT monitor with all its vast capabilities, you still need to know what the picture will look like on the average LCD or digital device.

Perrone Ford
January 6th, 2010, 12:49 AM
I use an SD monitor for color correction all the time. It's not ideal, but it's just fine apart from being small.

Computer-based video (anything coming out of the DVI port) has a different color space than what comes out via FireWire or an analog video output. So forget getting a DVI-to-analog adapter. I don't know which NLE you are using, but you have a few options for making this work.
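To put rough numbers on that color space difference: computer graphics normally uses the full 0-255 range per channel, while 8-bit broadcast video puts black at code 16 and white at code 235. A minimal Python sketch of just the levels part (the function names are only for illustration):

# Computer RGB is full range (0-255); 8-bit broadcast video puts
# black at 16 and white at 235 ("studio swing"). Feed one into a
# display expecting the other and your blacks and whites shift -
# part of why a straight DVI-to-analog adapter is the wrong tool.

def full_to_studio(code: int) -> int:
    """Map a full-range 0-255 code onto the 16-235 studio range."""
    return round(16 + (code / 255) * (235 - 16))

def studio_to_full(code: int) -> int:
    """Map a 16-235 studio-range code back to full range, clipped."""
    return max(0, min(255, round((code - 16) / (235 - 16) * 255)))

print(full_to_studio(0), full_to_studio(255))   # 16 235
print(studio_to_full(16), studio_to_full(235))  # 0 255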

Essentially, you need to get a signal with broadcast color levels out of your machine and into that monitor. You can do this in one of two ways.

1. Use FireWire. If you choose this route, you will need a FireWire connection on your computer and a device that can translate it to an analog signal. I used my DV deck for this; it works great, but it might be an expensive solution for many home users, though I imagine prices are falling rapidly these days. Another alternative is to use a camera: you connect the camera to the computer via FireWire, and the camera to the monitor via S-Video or component. The drawback, obviously, is that you will be tying up a camera and it's going to be on a lot. Darn near any camera should do the trick, so maybe something on eBay would work.

2. Use a dedicated video board. There are a number of these around, and SD-based ones are probably pretty cheap. Essentially, you install it in your PC, connect it directly to your monitor, and then tell your NLE to send output through that card. A company called Blackmagic makes numerous cards for this purpose, and they are excellent. You can even get combo SD/HD cards, so that should you decide to move up to HD monitoring, you can keep the same card.

Again, there is absolutely nothing wrong with using your SD monitor for your work. I do it all the time, and the colors stay true. I am then able to burn Blu-rays with confidence, and I check my Blu-rays against the numerous LCDs and plasmas at the office.

Good luck with your new monitor.

Perrone Ford
January 6th, 2010, 12:54 AM
I am not AT ALL saying you made a huge irreparable error and now have a completely useless professional monitor, as you might take me to be saying.
But I will say that even if you use a wonderfully full-color CRT monitor with all its vast capabilities, you still need to know what the picture will look like on the average LCD or digital device.

A calibrated professional monitor is a known quantity. It allows the person creating the video to have an accurate picture of what the colors look like, and it's repeatable time and time again. Yes, the work will look different on a plasma, an LCD, an LED, or a projector; home users do not calibrate their TVs, so the look is highly variable.

But go to any TV station, film studio, or anywhere else professionals work with video, and you'll see exactly the same thing: a professional monitor, professionally calibrated, so that the person working can get repeatable and accurate results. What happens to the signal when it hits the user's home is another matter. But it is our duty as professionals to provide a broadcast-legal signal that gives the best color, contrast, and fidelity we can.
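"Broadcast legal" has a concrete meaning in 8-bit digital terms: luma stays within 16-235 and chroma within 16-240. A toy Python check, just to make the idea concrete (real legalizers work on whole frames, not single samples):

# Legal-range check for one 8-bit Y'CbCr sample: luma belongs in
# 16-235 and chroma in 16-240; values outside risk clipping on air.
def is_broadcast_legal(y: int, cb: int, cr: int) -> bool:
    return 16 <= y <= 235 and all(16 <= c <= 240 for c in (cb, cr))

print(is_broadcast_legal(235, 128, 128))  # True: 100% white
print(is_broadcast_legal(255, 128, 128))  # False: super-white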

Marty Welk
January 6th, 2010, 12:56 AM
Sure, but when I analyze my own signal on a CRT, it is a world of difference from the LCDs out there.
I can pinch a signal into looking good on all the LCD screens and it still looks great on the CRT; reverse that, and I can make a signal that looks great on the CRT and is ruined on various LCDs.
Without seeing what I am going to look like at the end user's, I might as well put on a blindfold - probably a $5000 one, too :-)

Perrone Ford
January 6th, 2010, 01:13 AM
So, if you plan to deliver to 5000 Blu-ray users, 200 DVD users, and untold numbers of folks online at YouTube, Vimeo, and other places, what do you calibrate to?

Do you calibrate to the couple of LCDs you have? Or do you calibrate to a known standard - the same standard the local TV broadcasters are using, the same standard ESPN is using, the same standard HBO is using? If you calibrate to the same standard as everything else delivered to that TV, then your material is going to look like the rest of the material going to that TV.

Let's assume for a moment that you calibrate to the uncalibrated LCDs around your house, and let's say all three of them have the reds turned down because you don't like too much red. So you calibrate your video to that. It gets delivered to someone who has their reds set at a normal level. Now it appears that everyone has wildly red skin, while the rest of what they see on broadcast doesn't look that way.

My advice to the OP is to calibrate to the same standard as the broadcasters. If someone has their TV set up wacky, then your video is going to look just as wacky as the evening news, and in exactly the same way.

Marty Welk
January 6th, 2010, 01:22 AM
LCDs will white-clip, black-clip, and look like 16-bit color when it comes to smooth, long gradients. Not to mention the cute "AUTO"-changing things - digital expansion and other digital goodies that, when ON, react COMPLETELY differently to a change in the signal.
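The gradient complaint is easy to demonstrate with a little Python: quantize a smooth ramp down to a low bit depth and count the steps that remain (the 6-bit figure is an assumption about a cheap panel, not a measurement of any particular set):

# A smooth 10-bit gradient has 1024 levels; a panel that only
# resolves 6 bits per channel collapses it to 64 visible steps -
# that staircase is the banding you see on long gradients.
import numpy as np

ramp10 = np.arange(1024)        # smooth 10-bit ramp, 0..1023
ramp6 = (ramp10 >> 4) << 4      # keep only the top 6 bits
print("levels in the 10-bit ramp:", len(np.unique(ramp10)))   # 1024
print("levels on a 6-bit panel:", len(np.unique(ramp6)))      # 64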

Sure, there are broadcast specs, but the broadcasters themselves aren't jamming their signals up into white clip on an LCD, are they?

Why would I set the monitors I use to compare what the average user will be seeing to red, or green, or anything else, unless I want to see what happens when THEY do that?

These digital contraptions don't have a dial wheel for phase anymore; they come with many presets that users will decide on, then MAYBE adjust from there for their preferences.
All I have to do is set a few of them to Standard, switch them up to Vivid, try the Sports mode and the Cinema mode, and see what part of my "broadcast spec" picture no longer fits the "digital reality of today".

If their Standard, in their opinion, is offset from where the manufacturer's standards are set, well, then they will still get the preference they selected. All I need to know is that when THEY set their LCD screens that way, MY signal still looks the way I intended.

there are "legal" broadcast signals, and there are signals that look good on many LCDs, and THEY are still legal

Perrone Ford
January 6th, 2010, 01:28 AM
OK Raymond,

There you go. Two varying opinions. Choose one of those, or maybe a different one altogether.

Best of luck.

Josh Bass
January 6th, 2010, 02:50 AM
Are you guys saying that an SD CRT is OK for HD color correction? If I ever upgrade my cam to something HD, I'll want to be able to color correct it accurately... you're saying this would all work out?

Marty Welk
January 6th, 2010, 03:32 AM
You certainly could see the HD color on an SD monitor :-) even if you can't see the HD resolution - if you can get the signal TO it, and without it changing somewhere along the way, like in one more D/A converter or some codec that makes its own changes trying to look good.

Josh Bass
January 6th, 2010, 03:43 AM
That's what I'm wondering... does whatever you need to do to downconvert the HD signal (let's assume a typical FireWire-to-D/A-converter chain - a little camcorder in this case - out to the monitor) alter it?

I've worked recently on shoots where we've used an old-school SD CRT monitor with an EX1, and for whatever reason the footage that looked fine on the CRT came out excessively warm in reality. This may have been due to our misunderstanding of how to calibrate to the EX1's ARIB bars at the time, though.

Resolution would be nice, but you can monitor that in your NLE. If I could get accurate color on an SD monitor and not spend $8000 on an equally accurate HD monitor, that'd be sweet. This is all purely theoretical right now, of course.

Marty Welk
January 6th, 2010, 03:55 AM
Bars - oh my, another thing I don't understand and shouldn't even discuss around real professionals. Naughty, naughty.

Bars: a signal generator in the camera. Useful for matching analog devices over long wires and such, and possibly for calibrating the monitor you're viewing on. Completely useless for analyzing things like compression tossing colors out left and right. Bars without gradient color and luma sweeps are just cartoons, which works out great if you're filming The Simpsons :-)

Internal camera signal processing: is it tied to the bars? Do the bars adjust when you adjust all that chip-to-output signal processing? Set up a "cine" processing profile, and what purpose do the bars then serve in relation to the processed signal? Do the bars and the video still match in black level when you've changed the camera's black level to some cine version of it? Does the color match when your cine version is some greenish thing? Do the color levels match when the cine processing is now pulling the color down a bit (even if that is now normal)?

Digital signal: it doesn't change phase or color depth or video level, because it is not analog. Either you get the whole digital signal at the other end of a wire, or you get nothing, or nasty blocks and artifacts (nothing useful). Analog had all that stuff going to heck over long cable runs; it had to be adjusted to make it "right" again.

So, bars - what exactly are they for now? :-) Err, setting up the monitor with a signal generator from the camera, on a monitor that is calibrated?? Oookaaay, that shouldn't take too long.
Then you switch back to the camera, which can be set any number of ways that are no longer a "standard".

For checking problems with non-digital signals over wire, like component/composite/S-Video? Wherein, if you were having problems piping a long analog signal, you would now use digital instead? So why would you need to check a digital signal for a change - it is either there or it isn't?

Ohh, I am sooo confused. This analog-to-digital conversion thing - it's all just in my mind, it isn't really happening like this, is it? It can't be; it must be a bad nightmare.

I don't know if you have ever seen this, but the broadcast station asks the editor to put "bars" at the front of their video so the station can easily adjust the color/luma/phase for it.
So the editor puts a set of perfect bars in front of a horribly adjusted video :-) The broadcast station, which required the bars, now adjusts to the bars, and the rest of the program goes out in its horribly adjusted state :-) lol.

Of course professionals know the correct way this is all supposed to work; I am just pointing out, again, that the "machine" is only as useful as the human running it.

Josh Bass
January 6th, 2010, 06:12 AM
Camera bars SHOULD always be consistent, despite anything you may do to camera settings... those colors should always be at a certain luma, chroma, etc. So regardless of cam settings, if you adjust your monitor to the cam's bars, what you see should be what you get.
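You can even compute those fixed levels: the luma of each 75% bar follows from the Rec. 601 weights (Y' = 0.299 R' + 0.587 G' + 0.114 B') scaled into the 16-235 range. A short Python sketch with nominal values only - real bars also carry chroma, which this ignores:

# Nominal 8-bit luma of the 75% color bars under Rec. 601.
bars = {  # 75%-amplitude R', G', B' values, normalized 0..1
    "white": (0.75, 0.75, 0.75), "yellow": (0.75, 0.75, 0.0),
    "cyan": (0.0, 0.75, 0.75), "green": (0.0, 0.75, 0.0),
    "magenta": (0.75, 0.0, 0.75), "red": (0.75, 0.0, 0.0),
    "blue": (0.0, 0.0, 0.75),
}
for name, (r, g, b) in bars.items():
    y = 0.299 * r + 0.587 * g + 0.114 * b
    print(f"{name:8s} Y' = {round(16 + 219 * y)}")
# white 180, yellow 162, cyan 131, green 112, magenta 84, red 65, blue 35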

I'm not sure about that "no phase or video level in digital" thing; that sounds wrong to me. There is no phase in HD, or at least no phase control on the HD LCD monitors I've used (and from what I understand, the ARIB bars - the HD color bars the EX1 generates - don't have a way to judge phase adjustment).

Don't know about some of your other queries.

A D/A converter is how you get something coming out of your NLE to show up on a CRT monitor: you go FireWire out of the computer, into the D/A device, and then some kind of video cable (BNC, RCA, S-Video) out to your monitor. That gives you an "honest" feed, vs. taking VGA out or some other craziness. Maybe this isn't necessary with new video monitors... not sure.

As far as setting up bars and seeing garbage in the program itself: I used to work at a TV station, and yes, you get all kinds of stuff. Sometimes the bars actually have something to do with the levels of the program/commercial/promo/etc.; sometimes it's all way off. It depends on who put it on tape or created the digital file and whether they knew what they were doing. We would always calibrate to bars, then spot-check long-form stuff (sitcoms, dramas, whatever) for correct video and audio levels, and watch commercials and promos in their entirety when making dubs for air. Good times.

Marty Welk
January 6th, 2010, 06:20 AM
"Camera bars SHOULD always be consistent, despite anything you may do to camera settings. "

Yes, that would be OK. I was just referring back to the analog days when, between genlock, camera matching, wire length and all, the bars themselves changed along WITH the camera's signal in many respects: SC phase, H phase, color loss, video loss, and a bunch of other stuff.
So with camera matching, the color bars DID follow a lot of each camera's own signal; if the signal at the other end of the wire had the color bars all wrong, then the camera signal at that end of the wire was all wrong too.

Now, with fully digital gear, the color bars are the same after a wire transport.
But with the loads of extra internal camera processing/adjusting, the camera's signal can be all over the place - some of it a reversal of how cameras used to be matched up.

Take the example of feeding two digital SDI/HD signals into a switcher: color bars from some signal generator will match, and it won't mean anything at all :-) Split-screen them and they both will be the same, while the cameras' pictures could still be processed completely differently.

With old analog and genlocking, matching the color bars was a requirement; I could split-screen the two cameras' color bars and get a good start on matching the cameras. A camera's internal processing was not as extensive then as the new digital processing is.

So with analog and all its faults, the color bars were an essential item for genlocked and wired color matching - you couldn't live without them.

Josh Bass
January 6th, 2010, 06:29 AM
Ah, but that's what your scopes are for.

Marty Welk
January 6th, 2010, 06:53 AM
Sort of what I am saying is that the color bars are not USED as much for one of the major purposes they once had with analog signals,
and they also don't reflect the changes made in the digital arena well enough to be totally effective.
They used to display every problem, issue, and incorrectness of an analog signal.
But
Digital doesn't have the same problems analog has; it has a whole new array of issues and problems :-) called compression, bit processing, data fudging, and cheap digital tricks. The old analog color bars don't do anything to show those problems. They won't very well show codec problems, artifacts, blocking, fringing, banding, noise, and all the new digital disasters that can occur. They won't show ALL the issues and problems with digital display devices, either.

Marty Welk
January 6th, 2010, 07:00 AM
Ah, but that's what your scopes are for.

Yes, and I luv those digital scopes in the software. But really, if your signal-generated, non-moving color bars, sent digitally through a wire and brought into a computer, don't already match and look really pretty on a scope display, then something must really be messed up :-)

Perrone Ford
January 6th, 2010, 08:43 AM
But digital doesn't have the same problems analog has; it has a whole new array of issues and problems :-) called compression, bit processing, data fudging, and cheap digital tricks. The old analog color bars don't do anything to show those problems. They won't very well show codec problems, artifacts, blocking, fringing, banding, noise, and all the new digital disasters that can occur. They won't show ALL the issues and problems with digital display devices, either.

I'm sorry, this is just wrong. You have all the same problems as before; they just manifest themselves differently. And if you think color bars don't have a place, then do the following:

1. Lay down a set of bars from the camera and record one minute of video with them.
2. Drop that on the timeline and convert it to an editing format like ProRes, or DNxHD, or whatever.
3. Edit the piece.
4. Convert that to a broadcast format.
5. Match the final broadcast-format output against the original signal and see what's happened to those bars.

They still have a place, and they can still tell you what's happening to your signal along the processing chain.
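For step 5, here is a crude way to put numbers on the damage, assuming you export a matching bars frame from the original recording and from the finished broadcast master as PNGs (the file names are made up for the example):

# Compare a bars frame before and after the processing chain.
import numpy as np
from PIL import Image

orig = np.asarray(Image.open("bars_original.png").convert("RGB"), dtype=np.int16)
final = np.asarray(Image.open("bars_final.png").convert("RGB"), dtype=np.int16)

diff = np.abs(orig - final)
print("mean abs error per channel (R, G, B):", diff.mean(axis=(0, 1)))
print("worst single-pixel error:", diff.max())

If the bars came through untouched, both numbers are zero; in practice every conversion nudges them a little.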

Marty Welk
January 6th, 2010, 09:38 AM
It is a concept that has yet to be understood.
I did not say they had no value; I never said that.
I said they need to reflect the changes that digital has thrown at us, even if their original use still exists.
Full, uncompressed analog signals didn't fall apart when you had massive pixel/color changes all over the screen, even if they fell apart all the rest of the time. :-)

If I want to see what happens when some 20:1 compression scheme is decompressed into some OTHER compression scheme, then decompressed and recompressed back again into one more lossy, high-compression scheme - if I am going to see what happens when my colors are tossed out left and right whenever there is a lot of movement on the screen - then I am going to need more than blocks of solid colors and a minor grey scale (ha ha, blocks - that is how it is compressed).

I am not trying to take away anybody's color bars. The color bars, like you say, can still show changes - changes that really shouldn't even be occurring with proper digital :-) unless for some reason you're processing the color bars themselves.
But they barely touch what is really happening to the signal during the processing,
giving cruddy digital compression and displays a free ride.

Perrone Ford
January 6th, 2010, 09:39 AM
Sort of what I am saying is that the color bars are not USED as much for one of the major purposes they once had with analog signals,
and they also don't reflect the changes made in the digital arena well enough to be totally effective.
They used to display every problem, issue, and incorrectness of an analog signal.
But
Digital doesn't have the same problems analog has; it has a whole new array of issues and problems :-) called compression, bit processing, data fudging, and cheap digital tricks. The old analog color bars don't do anything to show those problems. They won't very well show codec problems, artifacts, blocking, fringing, banding, noise, and all the new digital disasters that can occur. They won't show ALL the issues and problems with digital display devices, either.

Yep,

But I'm not sure how this relates to whether or not using a CRT broadcast monitor is viable. You say it's not, I say it is. So there ya go.

Marty Welk
January 6th, 2010, 09:53 AM
Again, I don't recall having bashed the CRT display, EVER.
Do you read the posts? I specifically stated that the CRT has a fantastic, capable picture that LCDs do not.
My only premise is that my signal is now being processed for display on the screen that sits in the middle of someone's living room, the one they paid good money for, believing that what I give them should look excellent ON IT. The two leftover CRTs moved to the bedrooms won't suffer from a signal that looks great on their primary TV.

Without seeing what that looks like, I can make the most wonderfully calibrated CRT image that exists, and it won't do the end user watching on an LCD any good unless I factored that into the equation (which could still be done on a CRT, with scopes or anything else).

Just as I think they should factor more than the usual 40-year-old color bars into the equation. Stuff changed, and I change with it.
The end user frankly couldn't care less that I used a $5000 monitor if, when they take out their store-bought movie and put in my video, mine has more LCD problems than the movie. They will just know what they see: the movie looked great, and my stuff has LCD issues.

What is so difficult about my concepts? I did not invent digital, and I think it was a disservice to sell digital as being "perfect," able to map every pixel on the screen 1:1, and then turn around and compress it. The act/art of compression does not make everything magically the "same," the way digital was sold as doing.

Raymond Jebb
January 6th, 2010, 05:03 PM
Guys, thank you so much for your replies. I was just reading them, and honestly I didn't understand 90% of it, except for Perrone Ford's first post where he talks about the FireWire connection. Everything else was too technical and difficult for me. Sorry.

Yes, I do have HD footage that I would like to color correct on this monitor. I don't need to view that footage in HD resolution; I just want to see the colors on a well-calibrated CRT monitor so I know what the "standard" looks like. Correct me if I'm wrong, but professionally color corrected, broadcast-standard material would occupy a "golden middle" position. Basically, if there are a million TVs that all have different qualities, settings, etc., broadcast-safe material would stay in the middle of a spectrum that no one knows how wide it is. I think what Perrone said really made sense to me. If it looks good on my LCD monitor, it would probably look not so good on many other monitors, because I have my monitor configured just to my taste. How many people have similar taste to mine? Not only that, but how many people have the same brand of monitor as I do? With broadcast-safe material, every end user takes that standard as a departure point and then adjusts the image as they wish, to their taste... Am I missing something?

In the end, I don't have a lot of money; I have a relatively cheap LCD monitor that I can't trust at all; I can't afford an HD reference monitor; and all I really want is to see real colors on my CRT :) Please tell me I didn't waste my $150 :))

Marty Welk
January 6th, 2010, 05:12 PM
We already both indicated that you did not waste your money.
The CRT will give you the correct colors (which is what you wanted), and watching your stuff also on the nasty LCD screen :-) will let you make sure the picture looks good on that TOO, when that is what they watch it on.

All you have to do is figure out how to get an SD analog signal to it.

Raymond Jebb
January 6th, 2010, 05:12 PM
Perrone, I've heard about that FireWire solution as well. That's what I was intending to do before I bought this monitor. When I went to pick up the monitor, the previous owner told me about connecting to it directly from his computer via VGA-to-BNC... so I thought that would be an easier solution. But I'm guessing VGA outputs an analog signal, and my MacBook (I edit in Final Cut Studio 7) outputs digital via DVI, and that is quite a difference, right? On top of that, you mentioned that a computer outputs a different color space than the deck... Sorry to be repetitive; I'm just trying to really grasp what's going on...

I do have a camera that I can dedicate to this job: a Canon HV20. If I use it a lot, I will buy a cheaper cam in the future. This sounds like the quickest, cheapest, and easiest solution for someone like me. If I go with this solution, can I be comfortable knowing that I'm seeing good colors?

Your second solution (the video board) - I looked at their website, but I can't be sure which product you meant specifically. I'm sorry, but would you be so kind as to post a link? Blackmagic Design: Products (http://www.blackmagic-design.com/products/) I assume this solution would involve putting some kind of card inside the computer. I have a MacBook laptop, not even a desktop, so I'm not sure it's even possible...

In any case, your suggestion really lifted my spirits. Thank you again.

Raymond Jebb
January 6th, 2010, 05:14 PM
Marty, sorry. I didn't mean to ignore you or your suggestions, but as you can see I'm not very good with tech stuff and I didn't fully understand what you meant. I just really needed a straightforward explanation of how to connect, where, what, etc.

In any case, I also appreciate your input. Thank you.

Josh Bass
January 6th, 2010, 05:18 PM
I don't know about $150, but the monitor I have (a 13" CRT, Sony PVM-14M2U) was around $400 used.

Here's the deal from way back when I got this thing, as I understand it.


Certain CRT monitors have phosphors that make them especially ideal for color correction. There are two kinds to look for: SMPTE C, which is the be-all and end-all of phosphors, and P22. The SMPTE C monitors are much more expensive (or were) but are supposed to provide super-accurate color reproduction, which means you'll have the greatest chance of the greatest number of people - with all different kinds of TV sets, monitors, preferences, settings, etc. - seeing your stuff the way it's supposed to look, or at least not too far from there. The P22s are supposed to be "good enough" for a lot of purposes. I think I remember something about SMPTE C monitors looking very different from how the picture would look on a typical TV set, but for some reason that was a good thing? Someone jump in here if I'm way off.

Personally, I use my monitor (which has P22 phosphors) to color correct my short films, a friend's feature film, and very occasionally my paid work. This is all SD stuff so far, so I have no experience with HD. I have seen my shorts and this friend's feature on several different projectors and TVs, and the colors usually look bang on to what I saw when doing the color correction. I say usually because some projectors are really bad, but maybe you have to try to accommodate those too.

Raymond Jebb
January 6th, 2010, 05:22 PM
I assume this is the card: Blackmagic Design: Intensity (http://www.blackmagic-design.com/products/intensity/) ?

Unfortunately, I don't have a desktop or a PC that would take this. I will stick with the FireWire solution until I can make money to buy fancy stuff :))

Perrone Ford
January 6th, 2010, 06:16 PM
Raymond,

Sounds like you understood what I said and meant perfectly. The HV20 will work just fine. When you start moving up to paid work, upgrade your tools accordingly. And forget the DVI-to-analog stuff. Also, I didn't realize you were on a laptop, so forget the card.

I will ask this: does your laptop have an S-Video connection on it? I know some of my laptops have one. If so, you can use that for color correction too, though it may not be as good as going through the camera if the camera has component out.

Josh Bass
January 6th, 2010, 06:31 PM
FireWire should work just fine... I have noticed a lot of the time, though, that if you use the monitor for watching your editing, the audio goes out of sync.

Perrone Ford
January 6th, 2010, 06:52 PM
FireWire should work just fine... I have noticed a lot of the time, though, that if you use the monitor for watching your editing, the audio goes out of sync.

Yes, and many NLEs allow you to adjust for that with a configurable delay.

Craig Parkes
January 6th, 2010, 09:03 PM
Ok, here is my take on it.

Firstly, the colour gamut of HD is not quite the same as SD's (Rec. 709 vs. Rec. 601 colour space). This makes a difference when you grade for HD off an SD source, even on a CRT, so there will be a difference between your SD CRT and an HD CRT - however, there are VERY VERY FEW HD CRTs in the world, and certainly NONE in anyone's home, and the difference isn't that dramatic.
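One concrete piece of that difference: SD and HD weight the same R'G'B' values differently when forming luma, which is one reason a grade can shift when SD and HD paths get mixed. A minimal Python comparison (the sample color is arbitrary):

# Rec. 601 (SD) and Rec. 709 (HD) luma weights applied to one color.
REC601 = (0.299, 0.587, 0.114)
REC709 = (0.2126, 0.7152, 0.0722)

def luma(rgb, weights):
    return sum(c * w for c, w in zip(rgb, weights))

sample = (0.8, 0.6, 0.5)  # an arbitrary warm tone, R'G'B' in 0..1
print("Y' under Rec. 601:", round(luma(sample, REC601), 4))  # 0.6484
print("Y' under Rec. 709:", round(luma(sample, REC709), 4))  # 0.6353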

Ideally, you would be colour grading on BOTH an LCD/plasma in HD and an SD CRT. In your setup, the cheapest way to do that would be to try to visually match the LCD in your laptop to a correctly set up HD LCD monitor being fed a valid Rec. 709 signal.

I.e., take your laptop into a post place with calibrated HD monitoring, look at a variety of footage on your monitor and their monitor, and change your laptop display settings so that they are as similar as possible.

Then, when colour grading on your CRT, you'll have a simultaneous reference for what the picture is going to look like on a calibrated SD CRT and on a very much consumer-level LCD panel that is also calibrated.

This combo is probably the most cost-effective solution to get you in the ballpark.

With a laptop, there are four products that will give you monitoring options: the AJA Io Express (Mac and Final Cut only), the AJA Io HD (Mac and Final Cut only), the Matrox MXO2 Mini, and the Avid Mojo DX (Avid only). With the exception of the AJA Io HD, all of these require a laptop with a PCI Express card slot, and they will allow you to output a full-HD, correct Rec. 709 signal from your laptop via HD-SDI. I believe they will also do a realtime downconversion to SD and should output a correct SDI signal of this, as well as regular SD output.

Some also provide HDMI output, and some also provide component output for both HD and SD - so among that pool of hardware and their associated software solutions, you have access to everything you need to properly monitor a correct signal.

If you are monitoring via FireWire to your camera, then the signal is almost certainly being converted to DV for monitoring, in which case, even if you are editing in a different codec, it will go through DV compression to be displayed, which means you will potentially be seeing artifacting that ISN'T on your original footage (assuming you aren't mastering back to DV).

Raymond Jebb
January 6th, 2010, 09:49 PM
I just fetched my Canon HV20 and, alas, it doesn't have S-Video out... It does have component out, but it doesn't fit any of the inputs on the back of the monitor. I'm pretty sure this monitor has a component card (I may be wrong). I'm attaching a photo of the back of the monitor and also of the cables that came with my HV20. Would you be so kind as to explain to a dummy what I have? :)

I'm guessing I have two choices.

1. Buy a camcorder that has S-Video out.

2. Or somehow figure out what kind of inputs my monitor has and deal with that.

S-Video vs Component - which is better?

http://i266.photobucket.com/albums/ii260/Jacque_Academia/aiptek/monitor.jpg

http://i266.photobucket.com/albums/ii260/Jacque_Academia/aiptek/monitor2.jpg

Thank you!

Raymond Jebb
January 6th, 2010, 09:52 PM
Sorry, double post below.

Raymond Jebb
January 6th, 2010, 09:57 PM
Ok, here is my take on it.

Firstly, the colour gamut of HD is not quite the same as SD's (Rec. 709 vs. Rec. 601 colour space). This makes a difference when you grade for HD off an SD source, even on a CRT, so there will be a difference between your SD CRT and an HD CRT - however, there are VERY VERY FEW HD CRTs in the world, and certainly NONE in anyone's home, and the difference isn't that dramatic.

Ideally, you would be colour grading on BOTH an LCD/plasma in HD and an SD CRT. In your setup, the cheapest way to do that would be to try to visually match the LCD in your laptop to a correctly set up HD LCD monitor being fed a valid Rec. 709 signal.

I.e., take your laptop into a post place with calibrated HD monitoring, look at a variety of footage on your monitor and their monitor, and change your laptop display settings so that they are as similar as possible.

Then, when colour grading on your CRT, you'll have a simultaneous reference for what the picture is going to look like on a calibrated SD CRT and on a very much consumer-level LCD panel that is also calibrated.

This combo is probably the most cost-effective solution to get you in the ballpark.

With a laptop, there are four products that will give you monitoring options: the AJA Io Express (Mac and Final Cut only), the AJA Io HD (Mac and Final Cut only), the Matrox MXO2 Mini, and the Avid Mojo DX (Avid only). With the exception of the AJA Io HD, all of these require a laptop with a PCI Express card slot, and they will allow you to output a full-HD, correct Rec. 709 signal from your laptop via HD-SDI. I believe they will also do a realtime downconversion to SD and should output a correct SDI signal of this, as well as regular SD output.

Some also provide HDMI output, and some also provide component output for both HD and SD - so among that pool of hardware and their associated software solutions, you have access to everything you need to properly monitor a correct signal.

If you are monitoring via FireWire to your camera, then the signal is almost certainly being converted to DV for monitoring, in which case, even if you are editing in a different codec, it will go through DV compression to be displayed, which means you will potentially be seeing artifacting that ISN'T on your original footage (assuming you aren't mastering back to DV).

Dear Craig,

Thanks a lot for this information. I'm certain this will be very valuable down the road, but unfortunately right now it is (I'm sure) out of my league. On top of that, I work on a MacBook, not even a MacBook Pro, which would have a PCI Express card slot. But it's very good to know the options and to be aware of the details. For now, for things that are intended not for major networks but for festivals and the like, I should be OK? That's all I need right now.

Comparing footage on my laptop and an HD monitor is a great idea. I know someone at a film school in NYC who has access to an HD reference monitor. Thanks.

Adam Gold
January 6th, 2010, 09:59 PM
S-Video vs Component - which is better?
Component. Just get some RCA to BNC connectors and you're set -- the RGB inputs are component. Just make sure you set the cam to output 480i to Component rather than 1080i.

Craig Parkes
January 6th, 2010, 11:11 PM
Dear Craig,

Thanks a lot for this information. I'm certain this will be very valuable down the road, but unfortunately right now it is (I'm sure) out of my league. On top of that, I work on a MacBook, not even a MacBook Pro, which would have a PCI Express card slot. But it's very good to know the options and to be aware of the details. For now, for things that are intended not for major networks but for festivals and the like, I should be OK? That's all I need right now.

Comparing footage on my laptop and an HD monitor is a great idea. I know someone at a film school in NYC who has access to an HD reference monitor. Thanks.

Absolutely, you should be fine. For reference, my first few HD music videos were graded on an iMac within Final Cut, with output on an old Sony Trinitron as a second screen that I had calibrated to the color bars in Final Cut. We aimed for something we liked in between the look of the LCD and the Trinitron, and it proved to work pretty well for us in terms of how it looked on both a broadcast monitor and on LCD TVs.

As Adam said, all you need is RCA-to-BNC connectors - it's just a different sort of connector; signal-wise, going with component is fine.

Raymond Jebb
January 6th, 2010, 11:54 PM
That's great! I'll do that. I'm going to B&H Photo tomorrow.

I don't know if you noticed in the second photo, but the connector on the HV20 that says "component out" doesn't look like an RCA connection, but more like some kind of FireWire... There is another connector labeled AV, with a headphone symbol next to it - is that RCA? I don't know if that makes a difference, but I'm guessing the B&H people should know...

Raymond Jebb
January 6th, 2010, 11:57 PM
Component. Just get some RCA to BNC connectors and you're set -- the RGB inputs are component. Just make sure you set the cam to output 480i to Component rather than 1080i.

Adam, may I ask why? Just curious.

The HV20 has two options: 480i and 1080i/480i.

Thank you.

Adam Gold
January 7th, 2010, 12:43 AM
The three colored plugs in your second photo are just standard RCA component. I assume the other end of the cable goes into the component out of your camcorder.

Don't use A/V - that's composite - although with an SD monitor I guess it doesn't matter as much.

At least with the Sonys I have (monitors and cams), if you set Component Out to 1080i/480i, it won't show up on a 480-only screen. It may not work the same with Canons.

Brian David Melnyk
January 7th, 2010, 01:36 AM
I think the argument is like mixing audio: using "accurate" monitors is essential to know whether you are adjusting the frequencies that actually need adjustment, and to actually hear how they are being adjusted. Even though the end product will be played as an MP3 through an iPod, or on a home system that pumps up the bass and high end, at least you know that the master is a known quantity that is as translatable to as many varied devices as possible. Mixing through home stereo speakers, for home stereo speakers, only works for that set of speakers, and only in that particular room...
That said, I am color correcting with a MBP, using the scopes like mad and checking the results on a TV and other monitors... my next investment will be a Matrox MXO2 LE, I think, and a Mac or Dell display.

Ervin Farkas
January 7th, 2010, 07:18 AM
Why would I set the monitors I use to compare what the average user will be seeing to red, or green, or anything else, unless I want to see what happens when THEY do that? These digital contraptions don't have a dial wheel for phase anymore; they come with many presets that users will decide on, then MAYBE adjust from there for their preferences. All I have to do is set a few of them to Standard, switch them up to Vivid, try the Sports mode and the Cinema mode, and see what part of my "broadcast spec" picture no longer fits the "digital reality of today".

If their Standard, in their opinion, is offset from where the manufacturer's standards are set, well, then they will still get the preference they selected. All I need to know is that when THEY set their LCD screens that way, MY signal still looks the way I intended. There are "legal" broadcast signals, and there are signals that look good on many LCDs - and THOSE are still legal.
Eventually, someone will have to tell you... let me be that bad boy. Marty, you are dead wrong. Sorry for being brutal, but you are defending a huge mistake.

No matter the industry, there are standards, and there is custom work. What you teach is custom work. Just a very simple example: let's say you need a bed, a mattress, and bedding. Manufacturers came together and set up standards - there are king, queen, full, etc. sizes. So now you can walk into any store and buy with confidence... the mattress will fit the bed, the sheets will fit the mattress, and so on. Just imagine what chaos it would be without the standards.

The difference between right and wrong is called STANDARDS.

Professionals use standards.

Andy Tejral
January 7th, 2010, 07:36 AM
First, it looks like that monitor can accept different kinds of component video. I didn't see a switch on the back, so there is probably a menu item.

Two, you may need terminators. It's hard to tell from the picture, but some newer monitors have automatic termination, others have a switch, and others need external termination. It's hard to believe it would need external terminators, but it is possible.

Also, I didn't see this answered, so: that component breakout cable you show in pic 2 plugs into the "firewire"-looking socket you mentioned (on the right, at the front of the camera). The real FireWire socket is on the back.

Huh, I was sure that the cam had S-Video, but you're right. Also, remember: to get audio, you'll need to use the other A/V breakout cable.

Raymond Jebb
January 7th, 2010, 10:15 AM
Sorry, I'm confused. That component breakout on my HV20 - it seems it's not a standard RCA connection. I haven't found anything on the web resembling that kind of connector; they all look round, like those RGB connectors at the other end of the cable. I can't find anything that resembles my "firewire"-looking connector on the HV20.

Andy Tejral
January 7th, 2010, 10:26 AM
I don't know if you noticed in the second photo, but the connector on the HV20 that says "component out" doesn't look like an RCA connection, but more like some kind of FireWire... There is another connector labeled AV, with a headphone symbol next to it - is that RCA? I don't know if that makes a difference, but I'm guessing the B&H people should know...

The R-G-B connectors are RCAs (also called phono plugs or pin plugs). The other end is a semi-proprietary connector; I have no idea what it is called.

Plug that cable into the camera and the other end into the monitor (with the BNC-to-RCA adapters).

You'll also need the A/V breakout cable for audio (that's a 1/8", or 3.5mm, connector).

Raymond Jebb
January 7th, 2010, 10:31 AM
Oh, sorry for the confusion. Now I see :) I still use the cable I have, but from there to the monitor I need RCA to BNC. Now it dawned on me :))) Thank you.

Ervin Farkas
January 7th, 2010, 10:38 AM
The small rectangular connector is not FireWire; it's a Canon proprietary connector. On the other end of your cable you have standard RCA male connectors.

Here is what you need to do:

1. Connect your camera to the computer via your FireWire cable.

2. Connect the proprietary video cable to the appropriate connector on the camera.

3. Buy three "RCA female to BNC male" adapters from RadioShack or a similar store, and connect your cable with those adapters to the monitor's video inputs - the top-left three BNC female inputs marked RGB "in" in the picture.

4. For audio: you probably don't have line out from the camera, just headphones, and that will only yield low-quality audio, so you're better off using your computer's audio.

I hope this helps,

Andy Tejral
January 7th, 2010, 10:44 AM
Two, you may need terminators. It's hard to tell from the picture, but some newer monitors have automatic termination, others have a switch, and others need external termination. It's hard to believe it would need external terminators, but it is possible.


I don't know if you even saw this, but disregard it. I looked it up in the manual - it has auto-termination.

Raymond Jebb
January 7th, 2010, 12:08 PM
I don't know if you even saw this, but disregard it. I looked it up in the manual - it has auto-termination.

Hi Andy, thank you!

Raymond Jebb
January 7th, 2010, 12:09 PM
The small rectangular connector is not FireWire; it's a Canon proprietary connector. On the other end of your cable you have standard RCA male connectors.

Here is what you need to do:

1. Connect your camera to the computer via your FireWire cable.

2. Connect the proprietary video cable to the appropriate connector on the camera.

3. Buy three "RCA female to BNC male" adapters from RadioShack or a similar store, and connect your cable with those adapters to the monitor's video inputs - the top-left three BNC female inputs marked RGB "in" in the picture.

4. For audio: you probably don't have line out from the camera, just headphones, and that will only yield low-quality audio, so you're better off using your computer's audio.

I hope this helps,


Thank you. I just went to RadioShack and got the adapters. I will test everything in the evening!

Thanks everyone. I really appreciate your help!