DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Graphics Card for Vegas 10? (https://www.dvinfo.net/forum/what-happens-vegas/489611-graphics-card-vegas-10-a.html)

Chris Nicholson January 1st, 2011 11:36 PM

Graphics Card for Vegas 10?
 
So from searching earlier threads it looks like the Vegas 10 CUDA GPU utilization
(not sure if this is the right technical term) is only at about 15-20%.

Unfortunately I'm having some computer problems and ended up blowing my graphics card today (long story).

I have to shop for a replacement and am wondering what card would be "good enough." I don't want some salesperson to talk me into getting more than I need if Vegas 10 makes only limited use of the GPU.

My system is an i7 920 with 6 GB of RAM. My old card was a GeForce 9600 with 1.5 GB of RAM.

What cards are people using? Should I just get an OK card and spend the extra money on more RAM?

I'm not a gamer and the primary use of this computer is editing, so I really don't need graphics power for anything other than perhaps editing stills.

Adam Stanislav January 2nd, 2011 01:24 AM

Personally, I just got a cheap Zotac card, Newegg.com - ZOTAC ZT-40602-10L GeForce GT 430 (Fermi) 1GB 128-bit DDR3 PCI Express 2.0 x16 HDCP Ready Video Card, which happens to have a CUDA chip that was released just a few months ago. I got it because it is a CUDA Fermi card, supports 3D, has an HDMI 1.4a port in addition to a dual DVI port and a DisplayPort, is inexpensive, needs only about 35 W of power, and requires only one slot in the computer. Most cards these days require two slots, burn power like crazy, and are expensive because card manufacturers seem to think the only reason people have computers is to play games on them.

Ironically, for video editing we do not need as great a video card as for game playing.

Leslie Wand January 2nd, 2011 01:37 AM

I picked up a cheapo GTS 250 1 GB - it's got dual DVI for dual display. Probably more than I need, but it was cheap - maybe not as cheap as Adam's - and I feel better with the dual DVI ports...

Chris Nicholson January 2nd, 2011 11:04 AM

CUDA Processors
 
Interesting info. I notice that Adam's graphics card has 96 CUDA Processors, and Leslie's has 128.

Does anyone know whether there is a limit on the number of CUDA processors that Vegas can use at once, or is it always more is better?

Adam Stanislav January 2nd, 2011 02:22 PM

Quote:

Originally Posted by Chris Nicholson (Post 1603755)
Does anyone know whether there is a limit on the number of CUDA processors that Vegas can use at once, or is it always more is better?

It is your video card, not Vegas, that determines how many CUDA processors run at the same time. That is, Vegas presumably (I say presumably since I do not have the Vegas source code to examine) just copies the entire frame to video memory; the video card then applies all of its CUDA cores to one portion of the image, then to the next portion, and so on until the entire frame is processed.
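Roughly, the pattern looks like this. This is only a toy sketch of a per-pixel effect kernel, purely my own invention for illustration (the names and the brightness effect are made up, not anything from Vegas):

Code:

#include <cuda_runtime.h>

// Toy per-pixel effect: each CUDA thread brightens one byte of the
// frame, and the grid of blocks covers the whole frame. Illustration
// only; nobody outside Sony knows what Vegas actually runs on the card.
__global__ void brighten(unsigned char *frame, int numBytes, int delta)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's byte
    if (i < numBytes) {
        int v = frame[i] + delta;
        frame[i] = (unsigned char)(v > 255 ? 255 : v);  // clamp to 8 bits
    }
}

// Host side: copy the frame to video memory, run the kernel, copy back.
void processFrame(unsigned char *hostFrame, int numBytes, int delta)
{
    unsigned char *devFrame;
    cudaMalloc((void **)&devFrame, numBytes);
    cudaMemcpy(devFrame, hostFrame, numBytes, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (numBytes + threads - 1) / threads;
    brighten<<<blocks, threads>>>(devFrame, numBytes, delta);

    cudaMemcpy(hostFrame, devFrame, numBytes, cudaMemcpyDeviceToHost);
    cudaFree(devFrame);
}

The card runs as many of those threads at once as it has cores for, then moves on to the next batch until the frame is done.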

As to whether more is always better, it is, provided they all run at the same speed (my card runs at 1400 MHz, the GTS 250 at 1836 MHz) and use the same CUDA architecture. If they run at different speeds (which these two do) or are built with different hardware architectures (which they are), you need to take that into consideration.

Different CUDA cards come with completely different architectures. My card uses version 2.1 (Fermi), while the GTS 250, as far as I can tell, uses the old version 1.x architecture, which has much less capability. So depending on what Vegas asks it to do, the card that is faster on paper may actually take more time. Or it may not; again, it depends on what it is asked to do.

My card supports DirectX 11; the 250 supports DirectX 10. Mine is the low end of the latest generation; the 250 is the higher end of the older CUDA generation. Very hard to compare. Apples and oranges, really.
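If you want to see exactly what a card reports about itself, the CUDA runtime will tell you. Here is a minimal query program (the cores-per-multiprocessor figures in the comment are from nVidia's published specs; everything else is the standard cudaGetDeviceProperties call):

Code:

#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    cudaDeviceProp p;
    cudaGetDeviceProperties(&p, 0);  // query the first CUDA device

    // Cores per multiprocessor by compute capability, per nVidia's
    // specs: 1.x has 8, Fermi 2.0 has 32, Fermi 2.1 has 48.
    int coresPerMP = 8;
    if (p.major == 2)
        coresPerMP = (p.minor == 1) ? 48 : 32;

    printf("%s: compute capability %d.%d\n", p.name, p.major, p.minor);
    printf("%d multiprocessors x %d cores = %d CUDA cores\n",
           p.multiProcessorCount, coresPerMP,
           p.multiProcessorCount * coresPerMP);
    printf("shader clock: %.0f MHz\n", p.clockRate / 1000.0);  // clockRate is in kHz
    return 0;
}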

Older software does not even support the 2.x architecture of CUDA because it is a recent development, and the 1.x and 2.x architectures use different binary code (kind of like how the old Motorola and Intel microprocessors could not run each other's code). Sony has not said which architecture it supports, but since Vegas 10 is recent, presumably it supports both.

When nVidia came out with CUDA, they just offered a CUDA compiler and told everyone to compile everything with it. These days, they offer a new compiler and tell everyone to compile for both the old architecture and the new one, and also to embed a PTX version, which is a text format; that way, when version 3.x comes out, current software will still work, because the CUDA driver will compile the PTX code into whatever binaries it needs on the fly.
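For the curious, that fat-binary build looks something like this with today's nvcc (mykernel.cu is just a placeholder name): the first -gencode line embeds 1.x machine code, the second embeds 2.x machine code, and the third embeds the PTX text that a future driver can compile on the fly.

Code:

nvcc mykernel.cu -o mykernel \
     -gencode arch=compute_11,code=sm_11 \
     -gencode arch=compute_20,code=sm_20 \
     -gencode arch=compute_20,code=compute_20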

All in all, the only way to tell which is faster would be to run some test code on both and see which one runs it faster. But as with all benchmarks, it would only tell you how each fares with that particular benchmark, not how it would work with Vegas.
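If you ever want to run such a test yourself, the CUDA event API gives you a stopwatch that runs on the card itself. A sketch (the doubling kernel is just a stand-in for whatever workload you actually want to compare):

Code:

#include <stdio.h>
#include <cuda_runtime.h>

// Stand-in workload; substitute whatever you want to measure.
__global__ void testKernel(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

int main(void)
{
    int n = 1 << 20;  // about a million floats
    float *d;
    cudaMalloc((void **)&d, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, 0);
    testKernel<<<(n + 255) / 256, 256>>>(d, n);
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);  // wait until the kernel has finished

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);  // elapsed milliseconds
    printf("kernel took %.3f ms\n", ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d);
    return 0;
}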

I am currently working on some new software. Originally, I was going to do what nVidia wants us to do, i.e., compile for both architectures. But the more I get into it, the more I see that many of the new features do not exist in CUDA 1.x, so I would actually have to write some of my code twice. And since I usually release my software for free and do not really care about selling many copies, I am now leaning toward supporting only CUDA 2.x: if you only have an old card, too bad, do not use my software or get a new video card.

Now, commercial developers are not going to do that, at least not right now, because it would affect their bottom line. But eventually they will start dropping support for CUDA 1.x, just as they are dropping support for Windows XP (not to mention Windows 95, let alone Windows 3.1).

So I personally like to future-proof my hardware purchases and opted for the card I got. There are, of course, faster CUDA 2.1 cards with more cores available, so if you want more speed, look at them. Or if not CUDA 2.1, at least CUDA 2.0, since it is the number before the dot that makes the two architectures mutually incompatible and separates the old from the new.

If you feel comfortable with tech talk, this document (PDF) describes the differences between the versions of CUDA from a programmer's perspective. Among other things, CUDA 2.x supports 64-bit floating-point math, which CUDA 1.x lacked until compute capability 1.3 (the CUDA compiler will still accept such code on older hardware, but the math is either emulated in 32-bit code, which is considerably slower, or done at 32-bit precision). More importantly, many of the operations both support are only "close enough" in 1.x but precise in 2.x. Does it matter in video editing? Hard to tell.
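For example, a kernel like this little one of mine uses 64-bit floats. Built for a 2.x card it runs as genuine double precision; aimed at a 1.x card below compute capability 1.3, nvcc falls back to single precision, so the extra accuracy silently disappears:

Code:

// Averages two buffers in 64-bit floating point. True double
// precision needs compute capability 1.3 or better; for older
// targets nvcc demotes double to float (with a warning), so the
// result is only single-precision accurate.
__global__ void average(double *out, const double *a, const double *b, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = 0.5 * (a[i] + b[i]);
}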

Chris Nicholson January 2nd, 2011 05:26 PM

So I decided to go with this one:

Micro Center - EVGA 01G-P3-1450-TR GeForce GTS 450 FPB (Free Performance Boost) 1024MB GDDR5 PCIe 2.0 x16 Video Card 01G-P3-1450-TR

Great price after the rebate, and while not cutting edge, it appears to use the new architecture.

I like the idea that it won't draw a ton more power than my old graphics card, yet with 192 CUDA cores it should give at least some improvement in speed.

I won't have it in my machine until later tonight, but thanks to both of you for your comments and advice.

Adam Stanislav January 2nd, 2011 09:07 PM

Good choice, Chris!

Jeff Harper January 3rd, 2011 08:01 AM

Chris, just to make sure you are aware: the card can only help with the AVC codec, nothing else...

All in all, no matter what card you buy, it will make little difference. Obviously you want the one that will fully utilize Vegas's capabilities, but if you are not working with Sony AVC files, who cares?

Someone please tell me: is it rendering AVC files or timeline performance that the GPU helps with? I believe it is only rendering, correct?

