Building a new SuperComputer for Vegas Pro 11!



Kim Olsson
May 15th, 2012, 10:29 AM
A lot has changed since I used Sony Vegas 7...

My old setup for SV7 was a Core2Duo E6700 at 2.67GHz, 4GB RAM, an nVidia GeForce 7900GS, 2x 80GB hard drives and 2 monitors, I believe... I paid $4000...

I haven't used Sony Vegas for 2 years. I gave my old computer to my sister, so I am trying to design/build a new setup for video & photo editing...

I need some suggestions from experienced people...

This is what I came up with, and these are my reasons:

¤ Intel Core i7 3930K 3.2GHz - (I could settle for the 2600K, but I really want socket 2011 for 8x RAM)
¤ Corsair Cooling Hydro H100 - (Going to OC to about 4.2GHz-4.7GHz)
¤ 64GB RAM (8x8GB 1600MHz) - (Everybody talks about the RAM!)
¤ Corsair SSD Force 3 Series 120GB - (This will be the system drive)
¤ 2x 2TB HDD - (This is for storage; I do have external drives too)
¤ 1000W 80+ Gold power supply - (After reading Jeff Harper's posts =)
¤ Windows 7 Pro - (I need the pro version to handle that much RAM)

This will cost about half of what my old system did, at about $2500.

Video & photo editing isn't my profession, just memorializing my loved ones... but I want good stuff...

I'm from Sweden, so my English isn't the best...

Any help would be appreciated. I'm really confused about the graphics card... I don't get any smarter reading this forum about it either... Would a GTX 570 or better be any good? Or can I save a couple hundred dollars by going without one?

Gints Klimanis
May 15th, 2012, 11:54 AM
You are building a serious cube. I wouldn't count on overclocking a part by nearly 50%, even with that water cooling. I opted for a six-fan water loop to keep things quiet, and during the summer months the 560ti's fan spins at 2800 RPM even when the card is doing little to nothing.

Intel is boasting performance increases for encoding, but those are for encoders using their new AVX instruction set. I wonder how long Sony will need to add support:
Intel® AVX - Intel® Software Network (http://software.intel.com/en-us/avx/)
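
For anyone wondering whether their own CPU already exposes AVX, one quick way is to read the CPU feature flags. A minimal Python sketch, assuming the third-party py-cpuinfo package (this is an illustration, not something from this thread):

```python
# Minimal sketch: check the CPU's feature flags for AVX support.
# Assumes the third-party py-cpuinfo package (pip install py-cpuinfo).
import cpuinfo

flags = cpuinfo.get_cpu_info().get("flags", [])
print("AVX supported:", "avx" in flags)
```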

Let us know if you observe a speed-up with 64GB vs 32GB RAM. If your app isn't using the RAM, Windows caches files in the available memory. Given the serial nature, size, and generally low data rate of compressed video files, it's hard to imagine the advantage of a huge amount of RAM. If you really are limited by memory bandwidth, purchasing faster (2400 MHz) RAM is a better performance move. Which motherboard are you using? On the ASUS Rampage III, I found that my motherboard chipset also needed water cooling, so it is included in the cooling loop.

As for the GFX card, the nVidia 570 is promising. I have an eVGA 560ti, but even with that, I don't notice any speed-up in my most common task: decoding XDCAM 35/50 Mbit MPEG2 in Sony Vegas and encoding with Sony AVC to MP4. What does help is CPU overclocking (on my six-core Core i7 980x, 3.3 GHz).

Other desktop apps such as DVDFab make better use of the nVidia card for video encoding. I hope to hear your experiences with nVidia's huge increase in CUDA cores in the Kepler line. I'd be willing to pay more for Sony plug-ins optimized to use more CUDA cores and multiple GFX cards for a 2x+ performance increase.

Kim Olsson
May 15th, 2012, 01:19 PM
Great to see responses...

I chose the Corsair Cooling Hydro H100 because I'm not really an OC guy...
It's the simplest CPU cooling solution, but also one of the most effective; it has gotten great reviews for its effectiveness when overclocking the 2600K and 3930K.
And don't forget, I live in Sweden =) Winter temperatures Nov-Mar are usually -4°F
to 32°F. Summertime, never above 77°F.

And for the motherboard, I'm looking at the Asus P9X79, which has room for 8 RAM sticks...
And for RAM, I think I need it. When I toggle between Photoshop and Vegas with heavy projects, using and editing high-res images, high-def video, multiple tracks and compositions, I believe it's useful...

Yea, yea. I know. Some think I'm just a guy playing around making home videos who thinks he needs 64GB of RAM. Well, I love computers and their development. My first 486, a DX2 50MHz, had only 4MB of RAM. So I'm curious what 64GB of RAM can do =)

I hope 64GB at 1600MHz is more effective than 32GB at 2400MHz; I can't afford to test them both...

I read that GPU acceleration wouldn't affect the workflow, just the render time...
But people here seem disappointed. I've read about people buying the GTX 680 who couldn't activate GPU rendering. So I believe I'll go without a graphics card for now, and maybe purchase one once people get them working properly in Vegas Pro 11; by then I'll also know specifically which GPU/model to choose...

/Kim

Gints Klimanis
May 15th, 2012, 02:50 PM
There is no need to apologize for being a hobbyist. I would be interested if you're willing to test 32GB vs 64GB.

As for the GFX card, what remains to be seen is whether Sony is able to make use of the GFX power. They claim 2-3x performance, but I'm not seeing this on the 560ti. Please let us know what you find with the 6x0 series.

"Benchmarking GPU Acceleration in Vegas Pro 11"
Vegas Pro 11 GPU acceleration (http://www.sonycreativesoftware.com/vegaspro/gpuacceleration)

"Five steps to GPU power in Vegas Pro 11" by Gary Rebholz
Five steps to GPU power in Vegas Pro 11 (http://www.sonycreativesoftware.com/GPU_power_in_Vegas_Pro_11)

"Sony Vegas Pro 11 Review: Now, With Graphics Acceleration"
Sony Vegas Pro 11 Video Software Review | PCWorld (http://www.pcworld.com/article/247239/sony_vegas_pro_11_review_now_with_graphics_acceleration.html)

Gary Bettan
May 15th, 2012, 05:26 PM
Sounds like a great machine. Very similar to our DIY9 guide
Videoguys Blog - Videoguys' DIY9: It's Time for Sandy Bridge-E (http://www.videoguys.com/Guide/E/Videoguys+DIY9+Its+Time+for+Sandy+Bridge+E/0xe9b142f408a2b03ab88144a434e88de7.aspx)


We're big fans of the GTX570 and it will give you excellent GPU acceleration with Vegas Pro 11.

Gary

Jeff Harper
May 15th, 2012, 08:26 PM
If you're going all this way, Kim, why not get the GTX 680 card? It has triple the CUDA cores, which should mean increased performance and help with preview playback. It's all about the number of CUDA cores with Vegas. I agree you are seriously overdoing the RAM and will likely not benefit from it unless you need it for other apps.

Nice system, I'm a huge fan of overclocking, it's too easy to do these days.

Gints Klimanis
May 15th, 2012, 09:04 PM
Jeff wrote "It's all about the number of cuda cores with Vegas"

Jeff, do you see a speedup for video rendering to Sony AVC .mp4 or XDCAM EX .mp4? If so, which GFX card do you have?

Jeff Harper
May 16th, 2012, 01:40 AM
Gints, I turn off GPU acceleration for rendering most of the time; I've read that others have seen lower quality when using it. My CPU is already at 4GHz+, so I don't need the relatively minor speed increase for rendering. I'm mostly concerned with preview playback, as it is horrible in Vegas.

It's the number of CUDA cores in Nvidia cards that is the main factor in how much the GPU helps playback, so that's why I point this out.

Harm Millaard
May 16th, 2012, 02:08 AM
Although this is for Adobe CS6 rather than Vegas, that shouldn't make a huge difference in terms of what components a computer needs; you may find this page interesting: PPBM6 Planning a new system (http://ppbm6.com/Planning.html)

It is in the early stages of the build and I will add to this page as the build progresses.

Gints Klimanis
May 16th, 2012, 02:23 AM
For a 720p60 16 Mbps VBR render of 35 Mbit/sec Sony XDCAM EX, I get these numbers with Sony Vegas 11 (Build 511, 64-bit) on Windows 7. Vegas was restarted before each render.

Sony AVC: No GPU = 2:57, GPU = 3:00

MainConcept AVC/AAC: No GPU = 1:11, GPU = 1:35

I'm using an eVGA nVidia 560ti at 850MHz (overclock off) with an Intel Core i7 980 at 3.33 GHz (no overclock).
I have not benchmarked other Sony plug-ins, so perhaps the advantage is greater during normal use.
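
To put timings like these on a common scale, here is a minimal Python sketch (an illustration, not part of the benchmark) that turns the mm:ss render times into a no-GPU vs GPU speedup factor:

```python
# Minimal sketch: turn mm:ss render times into a "no GPU" vs "GPU" speedup.
def to_seconds(mmss: str) -> int:
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

def speedup(no_gpu: str, gpu: str) -> float:
    # >1.0 means the GPU render finished faster than CPU-only.
    return to_seconds(no_gpu) / to_seconds(gpu)

print(f"Sony AVC:            {speedup('2:57', '3:00'):.2f}x")  # ~0.98x (slightly slower)
print(f"MainConcept AVC/AAC: {speedup('1:11', '1:35'):.2f}x")  # ~0.75x (slower with GPU)
```

Both ratios land below 1.0, which is exactly the "no speed-up" result described above.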

Kim Olsson
May 16th, 2012, 03:34 AM
Thanks, Gints Klimanis. I had already looked at Sony Creative Software's own benchmark charts...
They look great.

But I wasn't convinced. After reading a lot of forum posts and real-life experiences from people using GPU acceleration, it seems many didn't get it to work properly. Some people even got bad render quality...

But after reading this: Sony Creative Software - Forums - Vegas Pro - Video Messages (http://www.sonycreativesoftware.com/Forums/ShowMessage.asp?Forum=4&MessageID=782966),
I'll go for the GTX 570 or GTX 580. NOT the GTX 590, because VP11 doesn't support dual GPUs at this time.

I was also really happy when I saw this: Sony Vegas Pro 11 with GPU Acceleration - YouTube
It confirms a faster preview/workflow with the GPU (VP10 at 2fps versus VP11 at 20fps with the "Glow" filter on).

And Jeff Harper, this is why I wasn't leaning towards the GTX 680: Adobe Premiere CS5, CS5.5 and CS6 Video Cards with CUDA Acceleration Mercury Playback Unlock Enable MPE Hack Mod Tip (http://www.studio1productions.com/Articles/PremiereCS5.htm)

I know this big benchmark test was for Adobe Premiere CS6 and the Mercury Playback Engine.
But I think it's relevant. Please read it all; you'll see further down that the GTX 680 wasn't really any faster at rendering than the GTX 570 (1 second)!!

But will the GTX 680 have an advantage over the GTX 570 when it comes to faster workflow/preview playback??
I am so confused.

And maybe I'll settle for 32GB of RAM (quad channel).
But I have to wait; I found out that Windows 8 is coming soon... I don't want to buy a new system with an old operating system =)

Leslie Wand
May 16th, 2012, 04:14 AM
@Kim - unless you want to be on the cutting edge (and perhaps bleeding to death), I wouldn't wait for Win 8 when the software you're using isn't tested on it. Nor would I be in a rush for the first iteration of ANY new software, be it NLE or OS. Let some other poor bastard find out the problems first ;-)

Jeff Harper
May 16th, 2012, 06:20 AM
Kim, it's the number of cores in the GPU that affects Vegas, and the 680 has over double the number of the 570. I don't know how tests for Premiere relate to Vegas.

I would personally not spend all that money on a new computer and not take advantage of the 680, but that's just me.

Good luck, tough decision.

Harm Millaard
May 16th, 2012, 07:04 AM
Jeff,

Do not be lured by the simple number of CUDA cores in the GTX 680 over the GTX 580. They have completely different architectures, and the number of CUDA cores is just not comparable. My partner in crime, Bill Gehrke, has tested the differences with PR CS5.5 and CS6 with both cards; the differences are small, but the 680 is the winner in all aspects. There are a number of major benefits to the 680 apart from the performance gains - which may not apply to Vegas - namely lower power consumption, lower temperatures and, most importantly, the 4 GB VRAM versus the more limited memory of the 5xx series. Now, this will only make a difference if you use large stills or RED material, but I thought it worth mentioning.

Jeff Harper
May 16th, 2012, 08:49 AM
Interesting, Harm; maybe Kim is better off not spending the extra if the performance is not much better. Nevertheless, for a new system I would personally go with the fastest, as I'd feel good about it. I would buy the card today if my current card weren't relatively new, less than a year old.

Harm, I'm glad to hear someone has tested the card against the 570, but I'd still be keen to test how much the card helps preview performance on my system. I'm just so sick of the terrible playback in Vegas.

With the Edius trial I downloaded last week, playback is flawless with multiple tracks of 1080 24p; it just makes me sick. I learn so slowly, and I really don't want to learn Edius, it's so foreign, but wow, what a program it is. It really does kill Vegas in a number of ways.

Harm, is the GTX 570 significantly better than the GTX 460 for Vegas preview performance, to your knowledge?

Harm Millaard
May 16th, 2012, 12:00 PM
Jeff, I haven't the faintest idea for Vegas. The 570 in PR is significantly faster than the 460. Have a look here: MPE Gain - PPBM5 (http://ppbm5.com/MPE%20Charts.php)

One of the major advantages - in some cases - of the 680 over the 580 is the capability to drive more than two monitors. I forgot to mention that.

For me, with a dual monitor setup and a TV connected but without third party cards like AJA, BM or Matrox, that is a huge advantage. It allows me to bypass the DV deck.

Gints Klimanis
May 16th, 2012, 12:45 PM
Harm wrote: "Do not be lured by the simple number of CUDA cores in the GTX 680 over the GTX 580. They have completely different architectures, and the number of CUDA cores is just not comparable."

Thanks, Harm. Is the difference closer to the memory bandwidth difference? There are certain types of math software (lots of operations that depend on few input points and produce few output points) that will benefit from more parallel computing units, but the most common video effects do not fall into that category.

The memory bandwidth limits the computing throughput of the CUDA cores. That is, any performance gain would involve overclocking (OC). I'll try the OC experiment, but so far my eVGA 560ti has decided it doesn't want to work as a GPU coprocessor at a 5-7% OC with its built-in air cooling. I'm debating whether I should install the water block I bought for the 560ti or invest in water cooling for the 6x0.

Gints Klimanis
May 17th, 2012, 01:23 AM
Results for overclocking an eVGA 560ti, rendering Sony XDCAM (35 Mbps, 4:2:0) MPEG2 to MP4 at 15 Mbps VBR:

For CoreClock = 963 (+13%), ShaderClock = 1926, MemoryClock = 2196 (+7%):
Sony AVC mp4 = 2:58 vs 3:00 stock (98%); MainConcept AVC mp4 = 1:30 vs 1:35 stock (95%)

There was no improvement for MainConcept moving to 983/1966/2243 or 943/1886/2462. 1034/2068/2462 and 994/1988/2258 were unstable and didn't complete. So overclocking the GPU memory isn't the answer either.

Overall, I'm not seeing a GPU improvement when paired with an Intel Core i7 980x (no OC). It appears that investing in a better CPU, or overclocking the CPU, gives a significant and roughly linear improvement in render speed.
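
That conclusion can be made concrete with a little arithmetic: if the GPU core were the bottleneck, a +13% core clock should buy roughly +13% render speed. A minimal Python sketch of the comparison, using the numbers above:

```python
# Minimal sketch: compare the GPU clock increase to the actual speed increase.
def to_seconds(mmss: str) -> int:
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

core_clock_gain = 0.13  # +13% core clock

for name, stock, oc in [("Sony AVC", "3:00", "2:58"),
                        ("MainConcept AVC", "1:35", "1:30")]:
    speed_gain = to_seconds(stock) / to_seconds(oc) - 1.0
    print(f"{name}: +{speed_gain:.1%} speed for +{core_clock_gain:.0%} core clock")
# Sony AVC: ~+1%, MainConcept: ~+6% -- both far below +13%, so the GPU
# core clock is not the limiting factor in these renders.
```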

Kim Olsson
May 17th, 2012, 12:52 PM
I am not going to buy my new computer until July...
So I will study the most recent technologies that fit my needs...

For now, I think I'll go for 32GB (2133MHz, quad channel) instead of 64GB (1600MHz).

And Jeff... I believe I will test the new GTX 670. If you're right that it's all about CUDA cores, it should perform nearly equal to the GTX 680. Actually, GTX 670 cards from manufacturers like Asus and Gigabyte perform better than the original GTX 680 (reference card).

GeForce GTX 680 | CUDA cores: 1536 | Standard memory config: 2048 MB | Relative graphics perf.: 95x
GeForce GTX 670 | CUDA cores: 1344 | Standard memory config: 2048 MB | Relative graphics perf.: 84x
GeForce GTX 580 | CUDA cores: 512 | Standard memory config: 1536 MB | Relative graphics perf.: 75x
GeForce GTX 570 | CUDA cores: 480 | Standard memory config: 1280 MB | Relative graphics perf.: 68x

So you see... for $100-$150 less, you get the new GTX 670 with only small differences...
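
One way to make that value argument concrete is performance per dollar. A minimal Python sketch using the relative-performance column above; the prices are assumed 2012 US launch prices, not figures from this thread:

```python
# Minimal sketch: relative performance per dollar for the cards above.
# Prices are assumed 2012 US launch prices, not figures from this thread.
cards = {
    "GTX 680": (95, 499),  # (relative perf, assumed launch price in USD)
    "GTX 670": (84, 399),
    "GTX 580": (75, 499),
    "GTX 570": (68, 349),
}
for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price:.3f} perf units per dollar")
# The GTX 670 comes out ahead: ~88% of the 680's relative performance
# for roughly 80% of its price.
```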

But in the meantime, I will investigate further...

And I hope more people jump on the wagon and enlighten us!!

The big question is the GPU chip and RAM config for VP11 =)

Jeff Harper
May 17th, 2012, 01:17 PM
Very interesting Kim, about the 670, please keep us posted!

Gints Klimanis
May 18th, 2012, 02:15 AM
Kim Olsson wrote:

GeForce GTX 680 | CUDA cores: 1536 | Standard memory config: 2048 MB | Relative graphics perf.: 95x
GeForce GTX 670 | CUDA cores: 1344 | Standard memory config: 2048 MB | Relative graphics perf.: 84x
GeForce GTX 580 | CUDA cores: 512 | Standard memory config: 1536 MB | Relative graphics perf.: 75x
GeForce GTX 570 | CUDA cores: 480 | Standard memory config: 1280 MB | Relative graphics perf.: 68x

Thanks, Kim. This GFX performance isn't from video editing apps, is it?

Harm Millaard
May 18th, 2012, 12:45 PM
Here is the table with Bill Gehrke's results for PR CS6. I don't know how this translates to Vegas, but at least it shows the benefits of the 680. Adobe Forums: Current sweet spot in GTX-4xx to -6xx graphics card? (http://forums.adobe.com/message/4420158?tstart=0#4420158)

Gints Klimanis
May 18th, 2012, 04:59 PM
Thanks, Harm.

Kim Olsson
May 18th, 2012, 06:34 PM
No, Gints. Actually it's relative to gaming...
It's a measurement so a gamer can compare graphics card performance.
Maybe it means something for apps as well, now that they can utilize the GPU.

Peter Siamidis
May 18th, 2012, 08:46 PM
Hey guys, figured I'd add some input to the fray. I use Vegas Pro 11 daily and do lots of encoding, so I've done many benchmarks on various setups. What I've been using for over a year now is this setup, with approximate prices:

Asus P8P67 motherboard, ~$150
Intel i7-2600k clocked at 4.7GHz, ~$310
Coolermaster V8 CPU cooler, ~$50
16GB 1600MHz RAM, ~$80
Corsair AX850 power supply, ~$160
NVidia 560ti, ~$220

I have Vegas scripts that do much of the work for me: I run a script and it adds watermarks and fade-ins, creates directories, and renders out multiple versions of the videos. Anyway, the above setup is relatively cheap, around ~$970, and it's been absolutely rock solid for me. For reference, with the 560ti video card Vegas Pro 11 encodes my h264 versions around 4x faster than with the CPU alone, which is really nice. The i7-2600k is also very overclock-friendly; mine has run at 4.7GHz all the time for ages now without issue, at around 1.400V and ~73C at full load. I've seen people run them higher than that, so I might have more breathing room there, but I'm leaving it at 4.7GHz for now.
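
Vegas Pro scripting is actually .NET-based (C#/JScript), so this is not literal Vegas script code, but the batch logic Peter describes has roughly this shape. A Python sketch in which `render_with_watermark` and the preset names are hypothetical stand-ins:

```python
# Hypothetical sketch of the batch logic (real Vegas scripts are .NET):
# for each source clip, make an output directory and render several versions.
import os

RENDER_PRESETS = ["1080p_h264", "720p_h264", "mobile_h264"]  # hypothetical names

def render_with_watermark(src: str, dst: str, preset: str) -> None:
    # Stand-in for the real render call (add watermark, fade-ins, encode).
    print(f"rendering {src} -> {dst} using {preset}")

def batch_render(sources: list[str], out_root: str) -> None:
    for src in sources:
        name = os.path.splitext(os.path.basename(src))[0]
        out_dir = os.path.join(out_root, name)
        os.makedirs(out_dir, exist_ok=True)  # create directories, as described
        for preset in RENDER_PRESETS:
            dst = os.path.join(out_dir, f"{name}_{preset}.mp4")
            render_with_watermark(src, dst, preset)
```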

I was considering switching to an NVidia 670, but because of the more limited design of its cores (compared to the NVidia 5xx series) in some operations, I've read it may not be as good for CUDA use right now. If someone had Vegas Pro 11 benchmarks for that card, that would be great to read!

Gints Klimanis
May 18th, 2012, 09:25 PM
Peter wrote: "For reference, with the 560ti video card Vegas Pro 11 encodes my h264 versions around 4x faster than with the CPU alone, which is really nice."

This I don't get. Are you using a lot of video FX? I just posted measurements showing nearly no performance difference between the GPU (560ti) and the CPU (Intel Core i7 980x, 3.33 GHz), rendering Sony XDCAM source files with only video and audio crossfades to 16 Mbps h264 (720p60 or 1080p30). Of course, video preview is off. When I overclock the CPU, there is a proportional increase in rendering speed.

Peter Siamidis
May 19th, 2012, 01:10 AM
This I don't get. Are you using a lot of video FX? I just posted measurements showing nearly no performance difference between the GPU (560ti) and the CPU (Intel Core i7 980x, 3.33 GHz), rendering Sony XDCAM source files with only video and audio crossfades to 16 Mbps h264 (720p60 or 1080p30). Of course, video preview is off. When I overclock the CPU, there is a proportional increase in rendering speed.

I'm not using any FX, just a watermark. I just did a quick render test; the source footage is from a Sony 560v camera and is 1920x1080 60p at 28Mbps. I dropped a piece of footage into Vegas 11, and all I added was a watermark using the legacy text media generator, which I've found renders faster than the newer text media generator. I rendered out a 1-minute segment of this video to this spec:

Mainconcept AVC
1920x1080
29.970fps
Two pass encoding
20,000,000 max bps
8,000,000 average bps
Progressive download enabled
2 Reference frames
Deblocking filter checked
High profile

I disabled smart resample. I have CPU and GPU meter gadgets that show how busy each device is. The results:

CPU only:
5 minutes 30 seconds
CPU cores bouncing between 92% and 100% use
GPU at 0%

CPU + GPU:
1 minute 34 seconds
CPU cores bouncing between 92% and 100% use
GPU bouncing between 54% and 71% use

So about 3.5x faster in that case. That's with Vegas Pro 11 build 683 (64-bit) on Windows 7 64-bit, NVidia 296.10 drivers, a 4.7GHz i7-2600k and an NVidia 560ti (original model).


EDIT: I just did one more test, a bit simpler. Same type of source footage and same render target; the only differences this time are no watermark (so it's the footage alone, with no FX) and the encoder set to 1 pass. Results to encode 1 minute of footage:

CPU: 2 minutes 37 seconds
CPU + GPU: 37 seconds

In that case it's a 4.2x faster encode time.

Randall Leong
May 19th, 2012, 12:51 PM
I'm not using any FX, just a watermark. [...] So about 3.5x faster in that case. [...] In that case it's a 4.2x faster encode time.

If you're seeing much faster performance in GPU mode compared to the CPU-only mode, then most likely your system has a bottleneck somewhere.

Peter Siamidis
May 19th, 2012, 01:00 PM
If you're seeing much faster performance in GPU mode compared to the CPU-only mode, then most likely your system has a bottleneck somewhere.

Not sure what you mean; all the CPU cores are pegged near 100% and not throttling due to temperature, so they are maxed out during encoding both when using the GPU and when not. I also get a relatively linear increase in encoding speed when comparing CPU encode times at stock, 4.3GHz and 4.7GHz. The hard drive has plenty of bandwidth to feed the encoder; even at 2x realtime encoding speed it only needs ~56Mbps, and the hard drive can supply far beyond that even on a bad day. My numbers are not unusual; I've read of others getting the same encode speed increases from using a GPU.
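
The bandwidth point checks out with simple arithmetic; a minimal sketch (the ~100 MB/s sequential figure is an assumed ballpark for a hard drive of that era):

```python
# Minimal sketch: disk bandwidth a 28 Mbps source needs at 2x realtime.
source_mbps = 28       # source footage bitrate, megabits per second
encode_speed = 2.0     # encoding at 2x realtime
needed_mbps = source_mbps * encode_speed   # 56 Mbps
needed_mb_per_s = needed_mbps / 8          # 7 MB/s
typical_hdd_mb_per_s = 100                 # assumed sequential HDD throughput
print(f"Needed: {needed_mb_per_s:.0f} MB/s of a ~{typical_hdd_mb_per_s} MB/s drive")
```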

Gints Klimanis
May 19th, 2012, 02:54 PM
Peter, I will try again with the same settings you posted. I don't have any 1080p60 material. Why don't we test with the same video clip? I'll post some Sony XDCAM footage. Since we both have the nVidia 560ti, we have a useful comparison.

Peter Siamidis
May 19th, 2012, 03:09 PM
Peter, I will try again with the same settings you posted. I don't have any 1080p60 material. Why don't we test with the same video clip? I'll post some Sony XDCAM footage. Since we both have the nVidia 560ti, we have a useful comparison.

Sure, that's cool. Just FYI, I have the original 560ti, not the re-released model that has more CUDA cores. So mine has 384 CUDA cores, compared to newer models that have 448.

Gints Klimanis
May 22nd, 2012, 06:14 PM
nVidia gives us some hints, but I'm not sure Sony uses nVidia's supplied CUDA encoders. When I use the MainConcept encoder, it's about twice as fast as the Sony one. It's also possible that I am decode-bound by Sony's MPEG2 decoder on XDCAM footage rather than H.264 encode-bound.

http://www.geforce.com/Active/en_US/en_US/pdf/GeForce-GTX-680-Whitepaper-FINAL.pdf
"GeForce GTX 680 also features a new hardware-based H.264 video encoder, NVENC. NVENC is almost four times faster than our previous CUDA-based encoder while consuming much less power."

Will a driver update improve CUDA H.264 encoding for older GPUs?

From the MainConcept site : CUDA H.264/AVC: MainConcept (http://www.mainconcept.com/?id=419)
"Using the MainConcept CUDA H.264/AVC Encoder, the whole H.264/AVC transcoding process is done on the GPUs, except for entropy encoding which is done on the CPU."

Peter Siamidis
May 22nd, 2012, 09:08 PM
To use that NVENC hardware encoder on the 6xx series, Sony will have to patch Vegas Pro; I don't think just a driver update would do it. Thing is, I'm always wary of speed claims. I've used so many GPU encoders and by and large they have all been crap; they sacrifice encode quality for speed. Vegas Pro is the first app I've used whose GPU encoder actually produces the same quality as its CPU encoder, at a faster rate, which is awesome. So I'll wait for reviews of the NVENC encoder first. I'd be thrilled if it turns out to be a quality encoder, though, and if Vegas Pro eventually supports it :)

Gints Klimanis
May 22nd, 2012, 10:03 PM
Agreed. The customer cited for GPU-accelerated encoding is CyberLink. I use DVDFab and notice that its two-pass H.264 encoder stalls after a short while and never finishes the job. The one-pass is pretty good.

I also have the "older" 560ti with 384 cores and 1 GB of graphics memory:
EVGA 01G-P3-1561-AR GeForce GTX 560 Ti FPB (Fermi) 1GB 256-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Video Card
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130604&Tpk=01G-P3-1561-AR

Coprocessor card manufacturers have a new market in which to expand. Why is a GPU considered to be "hardware" acceleration? Today, there is little to distinguish a multi-core CPU from a GPU other than the instruction set. Both are fully programmable.

Sony shows XDCAM (MPEG-2) acceleration with GPUs :
http://www.sonycreativesoftware.com/vegaspro/gpuacceleration

Kim Olsson
May 24th, 2012, 01:43 AM
OK. Now I'm getting stressed about all the theory...

My order is complete, and my new computer will arrive next week...
This is what I ordered:

Ivy Bridge 3770k (OC to 4.5GHz)
Asus P8Z77-V motherboard
32GB Corsair Vengeance 1600MHz
Gigabyte GTX 670 2GB
Intel 520 Series SSD 120GB
Samsung 2TB HDD
Corsair Hydro H100 CPU cooler

I have changed my setup because of input from this thread...
Due to the more expensive graphics card, I had to choose a cheaper CPU. That's OK, because the 3930k would only have affected render time, not the workflow, and I have no deadlines or anything like that. But if I am lucky, the GTX 670 will make the workflow faster, and hopefully the render time too...
This setup could be called "experimental", because even though the GTX 670 has 1344 CUDA cores and the GTX 580 only 512, the card may not be compatible with Vegas Pro 11...

I will download some standard project test files to compare workflow and render time when I can.
Of course I will share my findings.

See you later.

Gints Klimanis
May 24th, 2012, 03:07 AM
Jeff wrote: "If you're going all this way, Kim, why not get the GTX 680 card? It has triple the CUDA cores, which should mean increased performance and help with preview playback. It's all about the number of CUDA cores with Vegas."

Kepler CUDA cores operate at half the clock frequency of Fermi cores. Unless you're doing something like particle simulation, which stores equations (polynomial coefficients) in the register files on the GFX card, most applications will be limited by memory bandwidth. nVidia is touting operations per watt rather than total operations for the 6x0 round.
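
To see why raw core counts mislead across architectures, multiply cores by the clock the shaders actually run at. A minimal Python sketch using published reference clocks (Fermi's shader domain runs at twice the core clock; Kepler's runs at the core clock):

```python
# Minimal sketch: cores x shader clock as a rough throughput proxy.
# Reference clocks; Fermi (5xx) shaders run at 2x core clock, Kepler (6xx) at 1x.
cards = {
    "GTX 680 (Kepler)": (1536, 1006),  # (CUDA cores, shader clock in MHz)
    "GTX 580 (Fermi)":  (512,  1544),
    "GTX 570 (Fermi)":  (480,  1464),
}
for name, (cores, shader_mhz) in cards.items():
    print(f"{name}: {cores * shader_mhz / 1e6:.2f} M core-MHz")
# 680: ~1.55 vs 580: ~0.79 -- about 2x on paper despite 3x the cores,
# and real workloads are often memory-bandwidth bound anyway.
```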

Kim Olsson
May 24th, 2012, 03:49 AM
Personally, I believe a GTX 570 would be better than a GTX 670 or 680.
But what the heck... I want to see it with my own eyes. I'm tired of walking around "thinking" the GTX 570 is better...

;D

I'm going to see some proof soon.

Anyway, now I have a reason to play Battlefield 3 ;)

Peter Siamidis
May 29th, 2012, 03:57 PM
I just bought a 670 today, so I'll post some encode timing comparisons later today.

EDIT: Well, scratch that... with the 670 installed, Vegas 11 just gives an unknown error when trying to encode, and fails. So stick to the 5xx series of cards :)

Jeff Harper
May 29th, 2012, 10:36 PM
Bummer Peter, so sorry to hear that.

Kim Olsson
May 29th, 2012, 11:26 PM
The graphics card should be compatible with VP11; it meets the requirements listed on Sony's media software website... It could be a conflict with your old drivers... Or else it's only a question of time until the GPU chip works in VP11.
I haven't had my computer shipped yet, but I will do a clean install, with only Vegas and the newest drivers installed...
Great that you updated us, Peter.

Jeff Harper
May 29th, 2012, 11:33 PM
Kim, very good point. Peter, did you do a complete uninstall of your old video drivers? I have neglected to do that before, and found that when I finally uninstalled the old drivers completely, things straightened right up.

Gints Klimanis
May 30th, 2012, 01:19 AM
I'm sure Peter is sharp enough to install the drivers that were supplied with his brand-new GFX card. I've seen similar warnings/stalls from Vegas during my GFX card overclocking experiments. Frankly, the information provided by these pop-up dialog boxes is completely inadequate; such messages are no better than "Error -1009".

Jeff Harper
May 30th, 2012, 01:31 AM
Gints, we are not talking about whether he installed the right drivers, but whether he uninstalled his old ones first. This is easy to overlook, and I've done it myself. EVGA, for example, recommends using a driver removal tool before installing new drivers.

Gints Klimanis
May 30th, 2012, 11:58 AM
Jeff, which driver removal tool do you recommend? Thanks.

Jeff Harper
May 30th, 2012, 12:18 PM
Gints, I used Driver Sweeper; it was free and was mentioned on the EVGA website, so I trusted it. Some say such tools are unnecessary, and the idea of using them is apparently "controversial" in some quarters, but don't let the web talk distract you. It worked fine; you just have to be very careful not to uninstall anything you shouldn't.

Google "how to uninstall video drivers" or go to your manufacturer's site to get a general idea of best practices if you want.

Peter Siamidis
May 30th, 2012, 08:12 PM
I did a clean install of the newest driver, which for NVidia drivers means it uninstalls the old drivers and then installs the new ones. I didn't use any driver sweeper software, though; I've never really used those.

EDIT: Just saw this posted on the Sony forum regarding the NVidia 670 and the newest NVidia drivers: "yup. still sees the 670 card but will not make use of GPU render under mp4 settings." So it looks like there's an issue for now; we'll have to wait this one out.

Randall Leong
May 31st, 2012, 06:30 AM
I did a clean install of the newest driver, which for NVidia drivers means it uninstalls the old drivers and then installs the new ones. I didn't use any driver sweeper software, though; I've never really used those.

EDIT: Just saw this posted on the Sony forum regarding the NVidia 670 and the newest NVidia drivers: "yup. still sees the 670 card but will not make use of GPU render under mp4 settings." So it looks like there's an issue for now; we'll have to wait this one out.

Honestly, I would not use GPU rendering in Vegas if video quality is important, because GPU rendering sacrifices quality for performance. I much, much prefer CPU-only rendering with this software, regardless of the GPU used. Here's why:

First, all GPUs are clocked far slower than even really cheap CPUs. Second, very few GPUs have a total data throughput that matches even a mediocre-performing CPU. And third, most GPUs idle much hotter than most CPUs. Put all three together, and GPU rendering is just not ready for prime time right now.

Jeff Harper
May 31st, 2012, 07:08 AM
I completely agree with Randall; I do not like using GPU acceleration for rendering and never have. The time savings are not worth the quality loss. Besides, Sony AVC takes a stupidly long time to render anyway and is not superior to MPEG-2; I just don't see the attraction.

My only draw to the GPU acceleration feature is playback performance.

Randall Leong
May 31st, 2012, 07:38 AM
I completely agree with Randall; I do not like using GPU acceleration for rendering and never have. The time savings are not worth the quality loss. Besides, Sony AVC takes a stupidly long time to render anyway and is not superior to MPEG-2; I just don't see the attraction.

My only draw to the GPU acceleration feature is playback performance.

I have to admit that GPU rendering can be useful if it's correctly implemented (such as in Adobe Premiere Pro, which uses GPU acceleration for certain effects and scaling/resizing operations but performs all of the encoding entirely on the CPU). Unfortunately, Vegas and a few other NLEs also use GPU acceleration for encoding whenever the feature is enabled - and it is the GPU-based encoding that I really meant to criticize.

And yes, Sony AVC is artificially restricted to such low bitrates (16 Mbps or lower) that you might as well lock Vegas to MPEG-2 for all DVD and Blu-ray encodes.

Peter Siamidis
May 31st, 2012, 10:22 AM
I completely agree with Randall; I do not like using GPU acceleration for rendering and never have. The time savings are not worth the quality loss. Besides, Sony AVC takes a stupidly long time to render anyway and is not superior to MPEG-2; I just don't see the attraction.

My only draw to the GPU acceleration feature is playback performance.

I agree that in general that's true, as I've tried dozens of GPU encoders and they all sucked, but I find Vegas 11 to be an exception: the quality of the GPU encodes looks the same as the CPU encodes to me. So far, Vegas 11 is the only program I've ever encountered that doesn't sacrifice quality for speed. I really looked at them back to back for a while, as did others, and we could not notice any quality difference; hence I've been using GPU encodes for quite some time now. The ~4x faster encode time has really helped my business as well; I can take on many more projects, which has been fantastic.

Do you really see a quality difference when doing an A/B comparison of Vegas 11 CPU/GPU encodes? I mean, a bunch of us looked and looked; we spent a few days comparing the MainConcept AVC encoder on both CPU-only and GPU, and we just can't see any difference in quality. I don't know about the Sony AVC encoder, as I never use that one; I only use the MainConcept AVC encoder. As far as our eyes can tell, after much examination the CPU and GPU encodes are equally good. I ask whether you can really see the difference because, historically, GPU encodes have always sucked, so the prevailing thought of not using the GPU for encodes remains; I was the same way for ages, but Vegas 11 finally changed my opinion on that.