November 13th, 2009, 08:25 AM | #1
Major Player
Join Date: Mar 2007
Location: Salem, Oregon
Posts: 435
Interesting Observation on CPU Utilization for Vegas 8.0: Vista vs. Windows 7?
I created a project in 8.1 on my freshly installed Windows 7 x64 system. From experience on Win 7 and on Vista x64, I know that 8.1 utilizes ~80% of all of my cores (Core 2 Quad) fairly uniformly on renders, judging from the Performance tab in Task Manager. Such was the case with this new project, to which I had applied Neat Video, obviously in its 64-bit flavor.
It turns out that 8.1 was so danged buggy that I gave up and opened the same project in 8.0c to render. I did not change anything about the Neat Video FX; it was still applied. Upon rendering in 8.0c, I was surprised to see that, according to Task Manager, Vegas was still using ~80% CPU! This is in stark contrast to what I observed for 8.0c renders on Vista x64 and current Win 7 x64 systems, where CPU usage on renders never exceeded 35%, and even then only on a few cores. Rendering threads = 1 and RAM preview = 0 in all instances.

So what gives here? I realize this is anecdotal, since I haven't tested all parameters, but it looks like I'm getting great CPU usage in 8.0c on Windows 7 x64, essentially the same as when I attempted renders in 8.1. Is this because of the change in OS? Is it because the 64-bit version of Neat Video somehow "stuck" when I opened the project in 32-bit 8.0c? I'm not complaining about the enhanced performance, but it is nonetheless an unexpected (to me) benefit, and I'm curious to know the reason. Any thoughts?

Steve
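P.S. In case anyone wants to compare numbers more precisely than eyeballing the Task Manager graphs, here is a rough Python sketch (just my own idea, not anything from Sony or Neat Video) that uses the psutil package to log per-core CPU usage once a second while a render is running. Start it just before kicking off the render and read the averages afterward.

    import psutil

    # How long to sample, in seconds; set this to roughly the render length.
    DURATION_SECONDS = 60

    for _ in range(DURATION_SECONDS):
        # percpu=True returns one percentage per logical core;
        # interval=1 blocks for one second while sampling.
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        overall = sum(per_core) / len(per_core)
        print("overall %5.1f%% | %s" % (
            overall, " ".join("%5.1f" % p for p in per_core)))

Running that during an 8.0c render and again during an 8.1 render of the same project would show whether the ~80% vs. ~35% difference holds up outside of Task Manager's smoothing.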