View Full Version : Building a new SuperComputer for Vegas Pro 11!



Randall Leong
May 31st, 2012, 11:09 AM
I agree that in general that is true, as I've tried dozens of GPU encoders and they all sucked, but I find Vegas 11 to be an exception: the quality of the GPU encodes looks the same as the CPU encodes to me. So far Vegas 11 is the only program I've encountered that doesn't sacrifice quality for speed. I really looked at them back to back for a while, as did others, and we could not notice any quality difference, so I've been using GPU encodes for quite some time now. The ~4x faster encode time has really helped my business as well; I can take on many more projects, which has been fantastic.

Do you really see a quality difference when doing an A/B comparison of Vegas 11 CPU/GPU encodes? I mean, a bunch of us looked and looked; we spent a few days comparing the MainConcept AVC encoder on both CPU only and GPU, and we just can't see any difference in the quality. I don't know about the Sony AVC encoder, as I never use that one; I only use the MainConcept AVC encoder. As far as our eyes can tell after much examination, the CPU and GPU encodes are equally good. I ask if you can really see the difference because, historically, GPU encodes have always sucked, so the prevailing thought of not using the GPU for encodes remains. I was the same way for ages, but Vegas 11 finally changed my opinion on that.

True unless you're authoring Blu-rays with DVD Architect. The HD content "rendered" using the Vegas MainConcept AVC encoder is not compliant with the Blu-ray spec, and thus DVD Architect must re-compress this material, destroying video quality in the process.

Peter Siamidis
May 31st, 2012, 11:28 AM
True unless you're authoring Blu-rays with DVD Architect. The HD content "rendered" using the Vegas MainConcept AVC encoder is not compliant with the Blu-ray spec, and thus DVD Architect must re-compress this material, destroying video quality in the process.

Ah OK, that makes sense; I don't author for Blu-ray, so I don't encounter that. How about the Sony AVC encoder? I think it supports GPU as well. Does it produce lower quality encodes when using the GPU compared to the CPU? I've never really tested that one.

Gints Klimanis
June 3rd, 2012, 05:42 PM
I completely agree with Randall, I do not like using GPU acceleration for rendering, never have liked it. The time savings is not worth the quality loss.

Jeff, would you be able to describe how to create a render in which there is a visible difference between GPU and CPU rendering? I'm not seeing a difference with Sony XDCAM 35 Mbps MPEG2 source files (Sony EX1 camera) and either MainConcept or Sony AVC render to MP4 with/without GPU acceleration.

From what I've read, GPU rendering by MainConcept uses the CPU for part of the encoding and the GPU for the rest. Much of the criticism is that many companies use nVidia's reference CUDA GPU encoder source code for their GPU-accelerated product, and a different implementation for the CPU encoder. A GPU is like a CPU with thousands of processor registers and far more floating-point engines. GPUs will be much, much faster on highly iterative data that fits in the register file, but not as fast on problems that depend on the memory bandwidth available to the 300-1500 GPU cores. But I plan on writing my first CUDA video effect this month to understand the difference.
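
Something along these lines is what I have in mind - just a rough sketch of a trivial effect, nothing Vegas-specific, and the frame layout and names are made up for illustration:

// Rough sketch of a trivial CUDA "effect": add a brightness offset to every
// byte of an 8-bit RGBA frame. The frame size and names are invented for
// illustration; a real Vegas plug-in would receive the frame through its
// plug-in API instead of allocating a dummy one like this.
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

__global__ void brighten(unsigned char* pixels, int count, int offset)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;       // one thread per byte
    if (i < count) {
        int v = pixels[i] + offset;
        pixels[i] = (unsigned char)(v > 255 ? 255 : v);  // clamp to 8 bits
    }
}

int main()
{
    const int count = 1920 * 1080 * 4;                   // one 1080p RGBA frame
    unsigned char* host = (unsigned char*)malloc(count);
    for (int i = 0; i < count; ++i) host[i] = 100;       // dummy mid-grey frame

    unsigned char* dev;
    cudaMalloc(&dev, count);
    cudaMemcpy(dev, host, count, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (count + threads - 1) / threads;
    brighten<<<blocks, threads>>>(dev, count, 40);       // thousands of threads in flight
    cudaMemcpy(host, dev, count, cudaMemcpyDeviceToHost);

    printf("first byte after effect: %d\n", host[0]);    // expect 140
    cudaFree(dev);
    free(host);
    return 0;
}

The per-pixel work here is deliberately trivial, so the run time ends up dominated by copying the frame across the PCI-e bus - which is exactly the memory-bandwidth limit I was describing above.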

Jeff Harper
June 3rd, 2012, 06:41 PM
Gints, I have no clue. I know early on I didn't like the results with GPU-rendered footage, and in the end I don't care a whole lot; I'll tell you why.

I like MPEG-2 for Blu-ray anyway, and it doesn't utilize the GPU, so the point is moot in my case.

I personally do not care about improving rendering times with GPU, it's playback on the timeline I care about most.

GPU rendering improvements are so minimal with my OC'd CPU it just doesn't matter even when I do use it.

Good luck with your effect.

Leslie Wand
June 4th, 2012, 02:14 AM
Agree with Jeff - timeline playback IS the goal; rendering is of secondary importance.

Gints Klimanis
June 4th, 2012, 01:37 PM
GPU rendering improvements are so minimal with my OC'd CPU it just doesn't matter even when I do use it.


Agreed. I'm just trying to figure out why Sony states GPU rendering is about 3x faster while I'm not seeing any acceleration.

Jeff Harper
June 4th, 2012, 03:09 PM
I remember anecdotal evidence from around this forum early on that GPU acceleration seemed to be more dramatic with slower CPUs and less so with faster ones. Obviously there are variables - GPU speed versus CPU speed - and I don't know what the "right combo" is for optimal improvement, but I seem to remember that those with older or slower CPUs benefited more, though I have no proof of that. I might be way off or have it backward, but the bottom line is that the improvement varies with your setup, as far as I can tell.

I only know that I'm using an overclocked quad-core hyperthreaded CPU that's running at over 4GHz, and more rendering speed is not what I need. It is always nice when I can get it, of course, but it's not a priority.

So I'm still frustrated when video files, running off a RAID 0 array of 15k rpm SAS drives that isn't even 25% full, play back roughly as I edit a four-camera 1080 24p job.

I've contemplated a move to Edius for better playback, but I chickened out, again.

Kim Olsson
June 6th, 2012, 09:55 AM
Ok "SuperComputer have arrived" !! =)

As I said before, this is my setup:

Antec P280 case
Ivy Bridge i7 3770K (not yet OC'ed)
Asus P8Z77-V motherboard
32GB Corsair Vengeance 1600MHz
Gigabyte GTX 670 2GB
Intel 520 Series 120GB SSD
Samsung 2TB HD
Corsair Hydro H100 CPU cooling

I must say, I am really, really impressed with the workflow in VP11 and also the rendering so far...
My CPU temperature during a short rendering test was 38 degrees, motherboard 30 degrees.
Idle it's at 26 degrees.

- Vegas has found my GTX 670 in the Preferences > Video tab. That's good.
- When I choose a render and customize the template to check GPU, it says "CUDA is available".

But when I choose MainConcept AVC/AAC *.mp4, I get an error dialog: "The reason for the error could not be determined."

I tested with three different kinds of material on the timeline each time:
photos, videos from my iPhone 4S (1080p), and videos from my Nikon P300 (1080p).

Is there any test project file I should run?

Peter Siamidis
June 10th, 2012, 07:13 AM
That's the exact problem I had with the 670. The card is detected, it says "CUDA available", and it gives that error on encoding. I read that others have the same issue, and no one seemed to have a solution, so as far as I know we'll have to wait for Sony to patch Vegas to get 670 support. In the meantime I've been sticking with my trusty old 560 Ti.

Adam Stanislav
June 10th, 2012, 08:57 AM
And now there is the GeForce GTX 690 - GeForce (http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-690), so maybe the price of the 680 will drop?

Kim Olsson
June 11th, 2012, 12:00 PM
No price drop on that...

The GTX 690 is just a GTX 680 times two.
It has two GTX 680 GPUs on one card.

Adam Stanislav
June 12th, 2012, 10:51 AM
It has two GTX 680 GPUs on one card.

Interesting. Do they share their memory? In other words, can they work as if they were one GPU with twice the number of CUDA cores? So, would they be presented to Vegas as one OpenCL device or as two? I’m asking because even if there are several OpenCL devices installed on a system, Vegas will work with only one (or at least one at a time).
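
I suppose the quickest way to find out would be to ask the driver directly. Here is a rough sketch using the CUDA runtime rather than OpenCL (Vegas itself goes through OpenCL, and the names here are only for illustration); it simply lists every GPU device the driver exposes along with its memory:

// Rough sketch: list the GPU devices the driver exposes and how much memory
// each one reports. Uses the CUDA runtime for brevity; Vegas goes through
// OpenCL, but the device split should look the same either way.
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    int n = 0;
    cudaGetDeviceCount(&n);
    printf("GPU devices found: %d\n", n);   // a dual-GPU board should report 2

    for (int i = 0; i < n; ++i) {
        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, i);
        printf("  device %d: %s, %llu MB global memory\n",
               i, p.name, (unsigned long long)(p.totalGlobalMem >> 20));
    }
    return 0;
}

If a 690 shows up as two devices reporting 2GB each, then as far as applications are concerned the memory is not pooled - each GPU has its own.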

Kim Olsson
June 12th, 2012, 12:23 PM
Well, the card costs about $1300 here in Sweden, so I doubt any regular video-editing guy would buy one just to test. But anyway, VP11 only supports one GPU as far as I know. It doesn't matter whether it's two separate cards or one card with two GPUs; VP11 will let you choose which one to use...
Vegas is fairly new to using GPU power, and not even the GTX 670 (supported according to Sony's own website) works at the moment. The fastest card that works right now is the GTX 570, which they themselves demonstrate as an example on their website.
It's really bad if Sony has to manually add support for every new card as it arrives, because they don't work very fast.

Kim Olsson
June 18th, 2012, 03:12 PM
Hi! I just found this answer in a Q&A on Sony's website regarding GPU cards with a "compute capability" higher than 2.0:

"NVIDIA GPUs with Compute Capability prior to 2.0 are currently not enabled for GPU-accelerated video processing. We are working with nVIDIA on this issue. Click here for a list of Compute Capability levels for various nVIDIA GPUs"

That would include the GTX 670, 680 and 690, because they are listed as Compute Capability 3.0...

The good side: high-end graphics cards like the GTX 670, 680 and 690 will soon be supported!

The bad side: this was published 10/15/2011. Wow, I told you Sony doesn't work at a fast tempo...
They don't seem to give this subject any priority. I don't think they will support the cards until SVP12 =/

Here's the link if someone wants it: https://www.custcenter.com/app/answers/detail/a_id/5067/kw/nvidia

Adam Stanislav
June 19th, 2012, 08:37 AM
"NVIDIA GPUs with Compute Capability prior to 2.0 are currently not enabled for GPU-accelerated video processing.

The keyword is prior, i.e., before. Compute Capability 3.0 is not prior to 2.0. As long as the video driver for a 3.0 card supports OpenCL, Vegas should be fully able to use it.
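
If you want to double-check what your own card reports, something like this (just a CUDA-side sketch, not anything Vegas does internally) prints the compute capability the driver sees and compares it with the 2.0 floor from Sony's note:

// Sketch: print the compute capability of the first CUDA device and compare
// it against the 2.0 floor mentioned in Sony's note. A GTX 670/680/690
// should report 3.0, which is not "prior to 2.0".
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    cudaDeviceProp p;
    if (cudaGetDeviceProperties(&p, 0) != cudaSuccess) {
        printf("no CUDA-capable device found\n");
        return 1;
    }
    printf("%s reports compute capability %d.%d\n", p.name, p.major, p.minor);
    if (p.major >= 2)
        printf("not 'prior to 2.0', so it clears that particular requirement\n");
    else
        printf("prior to 2.0, so not enabled per Sony's note\n");
    return 0;
}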

Kim Olsson
June 19th, 2012, 11:12 AM
Oops, thanks for that. I'm from Sweden, so my English isn't 100%...

Well anyway, I have a GTX 670, which is 3.0 and does meet the requirements, but it isn't working.
I have the latest drivers (not the new betas, I might test those)... on a freshly installed PC with Windows 7 x64.

We'll see what happens in time.

Ray Turcotte
June 22nd, 2012, 02:17 AM
Sorry for hijacking the topic, but the title and discussion are appropriate for what I'm going to post...

I've just upgraded from a three-year-old Q9300 quad-core Acer computer (nVidia GT440, Vista 64, 6GB memory) that I bought in late 2010 to the following custom-built workstation that I spec'd myself:

Supermicro barebones SYS-7037A-i with Supermicro X9DAi motherboard.

Supermicro | Products | SuperWorkstation | Mid-Tower | 7037A-i (http://www.supermicro.com/products/system/tower/7037/sys-7037a-i.cfm?parts=show)

(This system is loaded with PCI-E 3.0 slots, SAS ports, SATA3 ports, USB3 ports and FW800 ports... more than I need.)

Single Xeon E5-2687W, 8 cores/16 threads (a second E5-2687W will be a future upgrade)
32GB Kingston 1600MHz ECC memory
nVidia GTX 670

1x 500GB WD Caviar Black SATA3 HD (boot drive)
2x 1TB WD Caviar SATA2 drives in Intel RAID 0 mode (source/edit drive - transferred from previous system)
External USB3 RAID array for daily backup and transfer/safekeeping (transferred from previous system)
Multimedia Card Reader

Dual monitors, keyboard, mouse & shuttle (transferred from previous system)

Windows 7 Pro 64bit Sp1

Anyways my point being...

On my old quad-core system, my current Vegas project with a 1.5-hour timeline would pin the CPU at 100% and take 13 hours to render.

On my new Supermicro system, the same project with more effects and edits - with GPU rendering disabled - takes 2 hours to render! (The CPU sits at 63% usage while rendering.)

I can now open another Vegas project and continue editing while rendering in the background. I can be super efficient and productive now. I couldn't be happier! ...and with a modest investment I can double the system performance again. If you doubt that, check out

PassMark Intel vs AMD CPU Benchmarks - High End (http://www.cpubenchmark.net/high_end_cpus.html)

and

PassMark Software - CPU Benchmarks - Multiple CPU Systems (http://www.cpubenchmark.net/multi_cpu.html)

I have a SuperComputer that will outperform most systems and last for a good number of years.

There is one gotcha to be aware of when dealing with dual-CPU systems: leaving CPU2 out of a dual-CPU motherboard disables half of the PCI-e slots. This was not an issue for my build, as I do not need all of the slots at this time.

And all for only about $1000 Canadian more than a high-end i7-3960X system would cost.

No overclocking, no liquid cooling.

I couldn't be Happier!

Money well invested, as far as I'm concerned.

Cheers
Ray

Kim Olsson
July 21st, 2012, 05:30 PM
Some time has gone by since I bought my computer. I'm still not able to render with the GPU activated... I have to choose "use CPU only" whenever that rendering option appears...

Anyway, the card does have performance benefits... Video FX and transitions are GPU accelerated, and so are many plug-ins... The general workflow gain is hard to measure, because I'm too lazy to remove my graphics card and test Vegas without it...

I noticed a performance increase when I clocked the CPU from 3.5GHz to 4.2GHz, both in rendering and in workflow. Not by very much, but noticeable...

The 32GB of RAM makes Windows handle VP11 like it was an MSN window or something. By that I mean VP11 is no longer the big application that takes up your whole computer's resources when you start it. Not even when you're rendering!!

Ray T - nice that you're happy with your setup. But you should really also look into an SSD flash drive as your system disk... It takes around 20 seconds to start up my Windows 7 x64 and load all boot applications and drivers...
And the time to install and load applications is ridiculously fast!

If rendering out projects is what matters most to you, the number of cores is very important. Of course a high-clocked CPU also helps rendering... and since VP11, the graphics card can increase rendering speed (not all formats or codecs support GPU rendering, just a few).

But for workflow inside Sony Vegas, it's all about a high-clocked CPU, and since VP11 a graphics card with a lot of CUDA cores is also great for boosting workflow (somewhat). RAM also makes things happen faster!!

Everyone's setup for Sony Vegas reflects what he/she prioritizes, or can afford... there is no "one and only" setup...

Happy editing everyone!

Kim Olsson
April 9th, 2013, 03:06 PM
Time to revive an old thread!!

I overclocked my i7 3770K to 4.6GHz a couple of months ago, and it's 100% stable. I use a Corsair H100 (closed-loop water cooler) and never see a temperature over 55°C when rendering...

The workflow on 1080p .mov files (48Mbps = 6MB/sec) is real-time editing without stuttering or hiccups (preview set to "Best, Full")... If my project is set to 1080p, 29.970 progressive, my preview will also play back at 29.970...

Of course, when adding third-party plug-ins and working on a more complex project, I lower the preview to "Preview, Half".

My graphics card is a Gigabyte nVidia GTX 670, which is also overclocked (+100MHz GPU core clock, +500MHz memory clock).
I do have GPU acceleration set to ON, because it has never made my system unstable...
In fact, it does make the workflow "faster" when dealing with many clips/effects/tracks and so on...

I chose the 3770K just to be able to OC as high as I can while still keeping temperatures low and staying stable... The 3930K is different: it generates more heat because it has more cores/threads. It's faster when it comes to rendering because it has more threads working. But for workflow I'm 90% sure that an OC'd 3770K at 4.6GHz is faster than a 3930K at 4.2GHz (a reasonable overclock for that CPU without being unstable)...

But on to the test... Does anyone have an i7 3930K (overclocked or not) who wants to benchmark a project, so we can compare it with an OC'd 3770K???

The benchmark will measure both preview playback inside Vegas Pro 12 (or 11) and the render speed of the project... Well, the benchmark has to be measured manually by you.

If someone is up for the test, I'll put together a 1080p, 29.970 project - one version without FX and one with some color correction or something on it - and upload it somewhere for you to download.

Interested?

I hate guessing!!

/Happy editing!

Gerald Webb
April 9th, 2013, 09:03 PM
Does anyone have an i7 3930K (overclocked or not) who wants to benchmark a project, so we can compare it with an OC'd 3770K???


Hi Kim,
I have a 3930K, only running at stock speeds though.
Let's do it - tempt me to OC again :)

Kim Olsson
April 9th, 2013, 11:16 PM
Great Gerald! =D

I'm going to put together a project this weekend for us to test...

It may take a couple of days. Any suggestions from your side? Any special FX or settings you want me to add?

By the way, do you have Magic Bullet? Just so I know whether I should use it in the project or not...

/Kim

Jeff Harper
April 10th, 2013, 01:34 AM
I have a 3930K at 4.0GHz. Would be happy to test your sample.

Gerald Webb
April 10th, 2013, 03:22 AM
Great Gerald! =D

I'm going to put together a project this weekend for us to test...

It may take a couple of days. Any suggestions from your side? Any special FX or settings you want me to add?

By the way, do you have Magic Bullet? Just so I know whether I should use it in the project or not...

/Kim
Yes Kim, throw some Magic Bullet at it.
Neat Video is a good benchmark as well, if you have it.

Kim Olsson
April 10th, 2013, 08:00 AM
Jeff, great that you'd like to join us...

Gerald - OK, I will use a clip with Neat Video on it.

As soon as I have built a project, I will give you guys the location you can download it from. I will include a .txt file with instructions on what to benchmark, so we can compare the results with each other...

Of course we'll share the results here on the forum later, so others can learn from the experience...

If someone would like to contact me, my mail adress is digitalflaskpost@gmail.com

Earlier today I sprained my foot at work, but I will try to get it up this week!!!

Jeff Harper
April 10th, 2013, 08:15 AM
Sounds great, can't wait.

I would prefer a sample with no effects if possible.

Kim Olsson
April 10th, 2013, 11:01 AM
Yes Jeff. One project file will be without any CC, filters, FX, etc.
Just the clips on some video tracks :D

Jeff Harper
April 11th, 2013, 05:19 PM
Cool, should be fun to do!

Kim Olsson
April 15th, 2013, 05:15 AM
Jeff, Gerald... Sorry for the delay, but it turned out I had a fracture in my foot. A lot has happened, so I haven't had time for computing...

I'll keep in touch...

/Kim

Jeff Harper
April 15th, 2013, 06:24 AM
Whoa, sorry to hear about your foot Kim! We'll be here! Please have a speedy recovery!

Kim Olsson
May 4th, 2013, 01:43 PM
OK, I am working on it... I think it will be ready maybe tomorrow...
Do you guys have the proDAD Mercalli stabilizer?

Gerald Webb
May 4th, 2013, 06:01 PM
No Kim, but I could download a trial if they have one.

Kim Olsson
May 5th, 2013, 03:42 AM
They do have a demo. I think the output video from the Mercalli demo version has a watermark in the center... I think we could use this plug-in on one clip of the project just to measure playback speed, because it takes a lot from your CPU.

Kim Olsson
May 5th, 2013, 05:58 AM
I'm sorry, we shouldn't use Mercalli... it pre-renders the file while stabilizing...

=/

Kim Olsson
May 5th, 2013, 02:02 PM
OK, I will upload the project files in a couple of hours... I'm going to look for a cloud storage service for you to download them from... Once I have uploaded the project, I'll post...

See you soon!

Kim Olsson
May 5th, 2013, 05:21 PM
Guys, I will upload the files overnight; the time for me now is 01:06 AM. I only have 1Mbit upload, so it will take a couple of hours...
Tomorrow I'll share the location. I created an account at Box.com. Hope that will work.

Kim Olsson
May 6th, 2013, 07:03 AM
Whoa, sorry to hear about your foot Kim! We'll be here! Please have a speedy recovery!

Thnx!

I have uploaded the project and the files to a cloud service (Box.com).
The instructions are inside those folders...

Here is the Link: https://www.box.com/s/lyiu1cgh1jlki26ywppd

I'll have the link open for a week or two!

Mail me your results to digitalflaskpost@gmail.com when you are finished.
I'll write them all down later...

/Happy editing =)

Kim Olsson
May 6th, 2013, 07:04 AM
No Kim, but I could download a trial if they have one.

I'm finished - go ahead and read my post above! =)

Gerald Webb
May 7th, 2013, 09:03 PM
Sorry Kim, I'll get to it in a couple of days. I'm working in Final Cut at the moment (haven't booted into Windows for a couple of weeks). Just have to get these jobs out the door first.
Cheers.

Kim Olsson
May 8th, 2013, 10:44 AM
No problem. We'll keep in touch.

Kim Olsson
May 14th, 2013, 02:16 PM
Jeff! It's over here! ;D

Check four posts above...

Jeff Harper
May 14th, 2013, 03:40 PM
Thanks Kim!

Kim Olsson
May 20th, 2013, 05:31 PM
I have posted the results in a new thread, "Benchmarking i7 3770k, 4.6GHz vs i7 3930K, 4.2GHz and 4.0GHz".
Check it out to see how these different CPUs compare when editing in Vegas Pro.