DV Info Net


Pete Bauer September 7th, 2012 02:12 PM

nVidia GeForce vs Quadro on PC Creative Suite Systems
 
NOTE: This thread was split out as a new topic from http://www.dvinfo.net/forum/adobe-cr...ml#post1752091

Quote:

Originally Posted by John Kemper (Post 1751842)
I think only the Quadro series by Nvidia is listed under the supported cards for the MPE. However, any Nvidia card with CUDA cores can run it; they just have to be added to the list. I'm currently using a new GTX 670 that way outperforms my old Quadro CX in everything but previewing 10-bit color.

John, thanks for the comparison. It isn't only Quadro cards on the certified list; some GeForce cards are on it as well, and the list has expanded over time:

Mercury Playback Engine | Adobe Premiere Pro CS6 - Adobe.com

And of course, the very simple text-file hack you mentioned does greatly expand the set of cards that will actually work with Mercury GPU acceleration, even if they aren't officially certified.
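
For anyone who hasn't tried it, a minimal sketch of that hack in Python (the default CS6 install path on Windows and the exact card name are my assumptions; GPUSniffer.exe in the same folder reports the name Premiere expects, and a Notepad run as administrator works just as well):

Code:

# Append an uncertified GPU to Premiere Pro CS6's cuda_supported_cards.txt
# so the Mercury Playback Engine will use it. Adjust the path and the
# card name to match your system; the name must be exactly what
# GPUSniffer.exe reports.
from pathlib import Path

cards_file = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6"
                  r"\cuda_supported_cards.txt")
card_name = "GeForce GTX 670"   # one card name per line in the file

lines = cards_file.read_text().splitlines()
if card_name not in lines:
    cards_file.write_text("\n".join(lines + [card_name]) + "\n")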

I wasn't sure of your meaning in the comment about 10-bit color. Can you confirm, for a PC, that a newer GeForce card will support 10-bit color via DisplayPort and/or dual-link DVI? I may be doing a build soon and might like to give GeForce a try versus the older Quadro I'm using on my main editing box.

John Kemper September 8th, 2012 11:19 AM

Re: CS6 Mac vs. Windows Difference
 
My GeForce GTX 670 can handle 10-bit color, kind of. I'll explain.

My previous graphics card was a Quadro CX. I use Speedgrade for coloring.

When working in Speedgrade, both the Quadro and the 670 work fine. When rendering out and importing a DPX frame sequence from Speedgrade to Premiere, both the Quadro and 670 work fine. However, when actually previewing the 10-bit color DPX sequence footage on the timeline within Premiere, the 670 stutters and skips the footage a bit while the Quadro plays it seamlessly. They both render out of Premiere to their final format just fine as well.
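
For a sense of the data rate involved, here's a back-of-envelope sketch (assuming 1080p at 24 fps and standard 10-bit packed DPX; the exact numbers will vary with your footage):

Code:

# Sustained data rate needed to play a 10-bit DPX sequence in real time.
# 10-bit RGB DPX packs 3 x 10-bit samples into 32 bits per pixel
# (2 bits padding); the small per-file header is ignored here.
width, height, fps = 1920, 1080, 24
bytes_per_pixel = 4                      # 30 bits of RGB + 2 bits padding
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 1e6:.1f} MB/frame, "
      f"{frame_bytes * fps / 1e6:.0f} MB/s at {fps} fps")
# ~8.3 MB per frame, ~199 MB/s sustained -- and every frame is a separate
# file, so any hiccup in the pipeline shows up as the stutter above.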

I've been told the issue is the drivers. GeForce drivers are tuned toward DirectX and are meant for tearing through games, not editing video. Quadro drivers are tuned more toward OpenGL and OpenCL, which work well for video editing but chug a bit if you try to play a modern game on them.

Unfortunately, due to the market separation, it is unlikely that Nvidia will release a card at GeForce prices that is tuned more toward OpenGL and OpenCL than DirectX.

Pete Bauer September 8th, 2012 03:27 PM

Re: CS6 Mac vs. Windows Difference
 
John, thanks again for that insight, although I got more than I bargained for.

It is a little surprising, but much appreciated, to get a reliable end-user report from someone who has used both types of cards and found the GeForce giving degraded performance versus an older Quadro. Granted, yours is a single report, but it does raise the question.

I was originally just wondering about the 10 bit color, since I have a Quadro FX4800 card (perfectly serviceable but maybe getting a bit long in the tooth by now) and bothered to buy IPS monitors to make use of its 10-bit color.

Some very knowledgeable folks have advocated GeForce over Quadro on the assumption that more cores for lower cost is better bang for the buck and that, if it ain't crashing, the drivers don't really matter, etc. Not having heard complaints from the GeForce user base, and assuming a much cheaper GeForce would outperform on the timeline and still give me the 10-bit color, I was thinking about trying a newer GeForce in an upcoming build. But your experience makes me hesitate.

Harm Millaard September 9th, 2012 02:57 AM

Re: nVidia GeForce vs Quadro on PC Creative Suite Systems
 
John,

Your remark may be interpreted in the wrong way.

Output to a monitor is 8-bit for all GeForce cards, including the GTX 680.
Output to a monitor is either 8-bit or 10-bit for all Quadro cards, depending on the port.

Only if you have a 10-bit monitor, like an Eizo ColorEdge, HP DreamColor or Dell PremierColor, does it make sense to consider a Quadro. The bulk of even IPS panels are 8-bit, so in those cases there is no benefit from a Quadro card. On the contrary, Quadros are limited to two monitors max, while the GTX 6xx cards can handle four.

For editing in PR, MPE performance is largely dependent on the memory bandwidth of the video card.

The GTX 680 has a memory bandwidth of 192.25 GB/s, a Quadro 4000 has 89.6 GB/s, a Quadro 5000 has 120 GB/s and the Quadro 6000 has 144 GB/s. So the performance gain for MPE in PR is best served with a GTX 680.
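
Those figures follow directly from the published memory specs: effective data rate per pin times bus width. The spec numbers below are as I understand them:

Code:

# Memory bandwidth = effective data rate (Gbps per pin) * bus width / 8.
cards = {
    "GTX 680":     (6.008, 256),   # (effective Gbps, bus width in bits)
    "Quadro 4000": (2.8,   256),
    "Quadro 5000": (3.0,   320),
    "Quadro 6000": (3.0,   384),
}
for name, (gbps, bus_bits) in cards.items():
    print(f"{name}: {gbps * bus_bits / 8:.2f} GB/s")
# GTX 680: 192.26, Quadro 4000: 89.60, Quadro 5000: 120.00, Quadro 6000: 144.00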

I have stated this on a site about a new system I'm building and documenting:

Video card

The much-touted Maximus solution, at least as touted by Adobe, is an utter waste of money: it requires a very expensive Quadro card plus a Tesla C2075 card, and the Tesla is slower than a simple, two-generations-old GTX 470 because it lacks the CUDA cores and memory bandwidth to be faster. We have several Maximus solutions in the current PPBM5 benchmark, and despite prices up to €6,000 for a Quadro 6000 plus a Tesla C2075, these solutions are easily outperformed by many GTX 470/480/570/580/670/680 cards that cost only a fraction. The only thing impressive about a Maximus solution is the price. The only reason to opt for one, despite the cost, is if you absolutely need 10-bit output to your very expensive 10-bit monitors, and then you only gain quality, not performance.

What determines video card performance for PR?

Based on the information currently available, it is not so much the number of shaders or CUDA cores. If it were, all Kepler cards, with about three times the CUDA cores, would easily leave Fermi and older cards in the dust, but that is not the case. The determining factor is memory bandwidth, which explains why the Kepler range is only slightly faster than Fermi and why Kepler is significantly faster than all Quadro cards.
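
As a toy illustration of why (my assumptions: MPE processes 1080p frames as 32-bit float RGBA, and each effect pass at minimum reads and writes one full frame through the memory bus):

Code:

# Upper bound on effect passes per second when memory traffic is the
# bottleneck: bandwidth / (bytes read + bytes written per pass).
frame_bytes = 1920 * 1080 * 4 * 4      # 1080p, RGBA, 4 bytes per channel
traffic = 2 * frame_bytes              # one read + one write per pass
for name, bw_gbps in [("GTX 680", 192.25), ("Quadro 6000", 144.0),
                      ("Quadro 4000", 89.6)]:
    print(f"{name}: ~{bw_gbps * 1e9 / traffic:,.0f} passes/s ceiling")
# The ceiling scales with bandwidth, not with CUDA core count, which is
# consistent with Kepler being only modestly faster than Fermi.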

Hope this clears things up a bit.

Roger Averdahl September 9th, 2012 03:05 AM

Re: nVidia GeForce vs Quadro on PC Creative Suite Systems
 
AFAIK, no GeForce card can output 10-bit to a preview monitor, only 8-bit, while the Quadro cards can output 10-bit color to a preview monitor.

GeForce = If you don't need 10-bit preview.
Quadro = If you need 10-bit preview.

Both cards can handle 10-bit video, but only the Quadro gives you 10-bit output.
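
To put numbers on it (simple arithmetic, nothing card-specific):

Code:

# Levels per channel and step size across a full 0-100% ramp; the
# four-times-finer steps are why gradients that band at 8-bit can
# look smooth at 10-bit.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:4d} levels/channel, "
          f"step = {100 / (levels - 1):.3f}% of full range")
# 8-bit:  256 levels, step 0.392%
# 10-bit: 1024 levels, step 0.098%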

EDIT: Harm beat me...

Pete Bauer September 9th, 2012 06:57 AM

Re: nVidia GeForce vs Quadro on PC Creative Suite Systems
 
Thanks for the additional info, gents. For both stills and video, I really want to stay with 10-bit output, so at least for me that settles the issue.

Even if my mid-cost IPS monitors actually display in 8-bit, the visual difference between them and a standard business/consumer display, both driven by the same Quadro card, is huge. It may be that the difference is more due to the better monitors than the bit depth of the card output, but I'd rather feed them the best signal my software and computer can provide.

I think the just-announced Kepler Quadro K5000 will be able to drive up to four displays, although it'll lighten the wallet. I'll be interested to see whether its performance in PPro measures up to expectations.


