View Full Version : Replacement Monitors
January 11th, 2007, 08:02 AM
I'm about to replace two 21" CRT monitors with two flat panels. These are for my editing system, and I'm running 2560x1024, which works just fine, although I could go up a notch if possible.
My question is: I see 20" to 21" flat panels anywhere from $200 to $500 each, and they have similar specs as far as resolution and contrast go. Is this a case of you get what you pay for, or do the lower-cost versions work just as well as the high-priced spread?
Anyone care to share the goods and bads about purchasing these kinds of monitors?
Money saved on monitors means better video card and more memory.
Thanks for taking the time to read and reply.
January 11th, 2007, 10:10 PM
Widescreen LCDs are good for layout and give you more space for your editing apps. But unless you're willing to spend $1500 or more, they're poor for judging color and for viewing interlaced/legacy content. Only the newest games have widescreen aspect settings that take advantage of them.
If you're doing color correction and/or print work that requires color accuracy, you should look into CRTs. CRTs are being phased out and replaced by color-accurate LCD technology, but at a price.
For home use, I only bought one LCD for layout and a pleasing view, and kept one of my pro CRTs for judging color and print work.
LCDs are hard to calibrate, and you'll need a Huey or Spyder to get them close. Even if you go all-LCD, you'll still care about color if you're printing family photos, etc., and wonder why they never print correctly. My wife, who is the 'family photographer' and uses Paint Shop Pro, kept bugging me about print matching and about fixing that worthless printer. As soon as I hooked up a calibrated CRT, the printer was 'fixed'.
January 12th, 2007, 07:25 AM
Thanks Peter, good discussion.
I'm interested in the LCDs (plasmas?) to provide editing space. I have a JVC pro studio monitor (CRT) that I intend to keep for judging the final output from the editor.
I think it would be nice to have one large widescreen, but the price for one of suitable resolution seems prohibitive. Two 20" or 21" flat panels will give the same resolution for about half the price. Now the question is how much I need to spend on each smaller screen, since there's a wide range of prices for similar specs. I'm not in a position to look at a lot of examples in person, so should I go for a "name brand" or buy the inexpensive ones?
Anyone else make this decision? How did it turn out?
Thanks as always,
January 12th, 2007, 08:10 AM
they suck at ... viewing interlaced/legacy content.
Would you mind explaining this in more detail? I've heard it before, but it's never made any sense to me, especially since I have no experience with LCDs. CRT computer monitors are usually progressive displays, and my understanding is that unless the entire screen is switched to an interlaced mode (which I know most of them support), you can only see progressive images. Even interlaced video files are played two fields at a time. I should think it would technically be possible to draw only one field at a time and fill in the other lines with black pixels, but that would mean drawing 59.94 complete pictures per second, in addition to the rest of the user interface, which I've been told most systems can't do.
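To make that idea concrete, here's a toy sketch (plain Python, a deliberate simplification I made up for illustration; a real player works on pixel buffers, not tiny lists of numbers) of splitting an interlaced frame into its two fields and showing one field at a time with the missing lines filled in black:

```python
# Hypothetical sketch: showing one field at a time on a progressive
# display, padding the other field's lines with black. Each frame is
# just a list of rows; each row is a list of pixel values.

def split_fields(frame):
    """Split an interlaced frame into its top (even-row) and
    bottom (odd-row) fields."""
    top = [row for i, row in enumerate(frame) if i % 2 == 0]
    bottom = [row for i, row in enumerate(frame) if i % 2 == 1]
    return top, bottom

def field_to_progressive(field, parity, height, black=0):
    """Place one field's rows into a full-height progressive frame,
    filling the other field's line positions with black pixels."""
    width = len(field[0])
    out = [[black] * width for _ in range(height)]
    for n, row in enumerate(field):
        out[2 * n + parity] = row
    return out

# A tiny 4x4 "interlaced" frame: even rows captured as field A (value 1),
# odd rows captured 1/60 s later as field B (value 2).
frame = [[1, 1, 1, 1],
         [2, 2, 2, 2],
         [1, 1, 1, 1],
         [2, 2, 2, 2]]

top, bottom = split_fields(frame)
shown_first = field_to_progressive(top, 0, 4)      # field A, black gaps
shown_second = field_to_progressive(bottom, 1, 4)  # field B, black gaps
print(shown_first)
print(shown_second)
```

Each output picture is progressive, so the display never sees an interlaced signal; the cost is exactly what's described above, namely pushing out 59.94 such complete pictures per second.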
How exactly are LCDs worse in this respect?
January 13th, 2007, 12:35 PM
Lew, don't get a plasma (and don't confuse plasma with LCD; they're different technologies) for editing. Plasmas suffer from image burn-in if you leave still images up for more than five minutes. LCDs have slower response times, and few have decent deinterlacing chips. LCDs are getting better in color and response time, but they just can't match CRTs at a decent price point yet. So keeping your CRT a little longer won't hurt.
What a good-sized widescreen LCD will do is give your editing app a roomier layout (longer viewable timelines) and room for full-page spreads (for print work).
If you have a decent dual-head graphics card (I have Quadro FXs), then depending on your editing app, you can pass the video to the CRT as overlay via your editing app; this is assignable by your graphics card. For instance, Nvidia cards allow full-screen video playback on the second monitor. Play any video clip, from the web, on your PC, or within your editor, and the graphics card will temporarily switch your second monitor into a full-screen viewer.
Premiere Pro 2 has a similar feature that allows scrubbing and playing the timeline full screen on the second monitor via a dual-head card as well.
I bit the bullet and purchased a Gateway 22" PC HD display for $400. I'm tempted to hand this over to my son and get the 24" model. However, I am pleased with the wide aspect in terms of editing, and confident in checking video playback and colors on my CRT.
Robert, here's some research on the subject:
January 13th, 2007, 12:54 PM
I'm familiar with the concept of interlacing, and I believe I understand the problem when sending an LCD a signal that is interlaced, but we're not talking about using LCDs as production monitors here, we're talking about using them as computer monitors.
The Windows or Mac interface, with any applications that are open, including any video clips that are playing, is always progressive, as far as I know. The signal coming out of the VGA or DVI connection on the graphics card is progressive unless the system's display settings are switched to an interlaced mode. Applications, whether they're media players, video editors, compositors, or anything else, are always drawn progressively; it's my understanding that Windows Media Player and Quicktime Player, for example, draw two fields on screen at a time, not one interlaced field after another. The video itself may be interlaced, but the display of said video is not.
Even in something like Combustion, where I can elect to draw one field at a time, one after the other, the program accomplishes this by simply filling in the missing field with black pixels, thus creating a progressive image. This image is drawn inside of the program interface, which sits on the Windows desktop, and it's all combined by your graphics card into one big progressive image which is subsequently sent to the monitor. CRT or LCD, the result should be exactly the same.
January 13th, 2007, 07:31 PM
It's my understanding that Windows Media Player and Quicktime Player, for example, draw two fields on screen at a time, not one interlaced field after another. The video itself may be interlaced, but the display of said video is not.
This image is drawn inside of the program interface, which sits on the Windows desktop, and it's all combined by your graphics card into one big progressive image which is subsequently sent to the monitor. CRT or LCD, the result should be exactly the same.
If I convert interlaced video to another codec in QT that supports interlaced output, I get interlaced playback out of the QT player, the same way I get interlaced playback on my ACD when playing back, say, 60i HDV footage from the Canvas or full-screen preview. The video card and system don't do any field blending; rather, each frame is played back with both interlaced fields in it, which can lead to some pretty heavy comb-type artifacts, especially when viewing individual frames. I believe these aren't as obvious on a CRT due to its normally lower resolution and the way the image is scanned to the screen.
But for normal screen operations or progressive video (anything non interlaced) I think you are right.
January 13th, 2007, 08:02 PM
All right, never mind. I'm too confused now, and I can't figure out how to explain what I'm thinking; I'll just have to wait until I see some LCDs firsthand to get a better idea of what exactly the problem is. Thanks for trying to explain it to me, though. I appreciate the effort.
Lew, I hope you find some monitors you're happy with, let us know how it turns out!