GPU versus CPU rendering at DVinfo.net



Old June 6th, 2014, 10:00 PM   #1
Regular Crew
 
Join Date: May 2008
Location: Burlington, Wisconsin
Posts: 137
GPU versus CPU rendering

Is CPU render quality inherently better? I thought I read this somewhere.
Mervin Langley is offline   Reply With Quote
Old June 7th, 2014, 02:54 AM   #2
Trustee
 
Join Date: Mar 2004
Location: upper hunter, australia
Posts: 1,368
Re: GPU versus CPU rendering

apparently gpu ISN'T as good with low bit rates. but don't take my word for it ;-)

one thing for sure though - cpu is a lot more reliable ;-(
__________________
www.lesliewand.com.au
Leslie Wand is offline   Reply With Quote
Old June 7th, 2014, 10:22 AM   #3
Trustee
 
Join Date: Jan 2013
Location: Austin, Texas
Posts: 1,704
Re: GPU versus CPU rendering

Quote:
Originally Posted by Leslie Wand View Post
apparently gpu ISN'T as good with low bit rates. but don't take my word for it ;-) one thing for sure though - cpu is a lot more reliable ;-(
Not sure where you are getting this information. That the GPU "isn't" as good with "low bit rates" doesn't make sense. Do you have a source for this?
Gary Huff is offline   Reply With Quote
Old June 7th, 2014, 10:40 AM   #4
Inner Circle
 
Join Date: Jun 2005
Location: Cincinnati, OH
Posts: 8,421
Re: GPU versus CPU rendering

I have read the same thing, Merv. I've seen it in a couple of forums: that GPU quality is less than CPU quality when it comes to rendering.

The way it was explained (which I don't recall) made sense to me, but I know nothing about processor design. The CPU on most desktops is a much more powerful processor than the GPU, and for some reason it was said the GPU sacrifices quality in order to get the job done.

Is it true? I have no idea.
__________________
http://JeffHarperVideo.com
The horror of what I saw on the timeline cannot be described.
Jeff Harper is offline   Reply With Quote
Old June 7th, 2014, 11:08 AM   #5
Trustee
 
Join Date: Dec 2002
Location: Gwaelod-y-garth, Cardiff, CYMRU/WALES
Posts: 1,215
Re: GPU versus CPU rendering

Surely, if it's digital, one can't be better or worse than the other? The only way you might get a quality difference is in the choice of rendering codec...
__________________
TV Director / Cameraman
Robin Davies-Rollinson is offline   Reply With Quote
Old June 7th, 2014, 11:41 AM   #6
Inner Circle
 
Join Date: Jun 2005
Location: Cincinnati, OH
Posts: 8,421
Re: GPU versus CPU rendering

This article leads me to be open to the possibility that GPU "could" be inferior to CPU encoding. Again, not enough to prove anything to me, as a Vegas user.

For us to say it makes no sense is understandable, but we should be open-minded to the idea.

H.264 encoding - CPU vs GPU: Nvidia CUDA, AMD Stream, Intel MediaSDK and x264 (page 27: Conclusion) - BeHardware
__________________
http://JeffHarperVideo.com
The horror of what I saw on the timeline cannot be described.
Jeff Harper is offline   Reply With Quote
Old June 7th, 2014, 12:12 PM   #7
Inner Circle
 
Join Date: Jun 2005
Location: Cincinnati, OH
Posts: 8,421
Re: GPU versus CPU rendering

Quote:
Originally Posted by Robin Davies-Rollinson View Post
Surely, if it's digital, one can't be better or worse than the other?
Robin, what you say makes sense, but by the same token, if all codecs are digital they should all produce equally optimal results too, and they clearly don't.

Video encoding is extremely complex, and a knowledgeable answer to this question might be hard to come by.
__________________
http://JeffHarperVideo.com
The horror of what I saw on the timeline cannot be described.
Jeff Harper is offline   Reply With Quote
Old June 7th, 2014, 01:42 PM   #8
New Boot
 
Join Date: Mar 2014
Location: Malaga (Spain)
Posts: 23
Re: GPU versus CPU rendering

Empirical evidence: is CPU really better at low bit rates? On my system, I think not.

https://drive.google.com/file/d/0B9o...it?usp=sharing

https://drive.google.com/file/d/0B9o...it?usp=sharing

Sony Vegas 13
nVidia GTX 550i
Sony AVC.
Juan Rios is offline   Reply With Quote
Old June 7th, 2014, 03:07 PM   #9
Inner Circle
 
Join Date: Sep 2003
Location: Portland, Oregon
Posts: 3,259
Re: GPU versus CPU rendering

I've been scratching my head over this thread: how could an "acceleration" of a process change the quality? After a little research I'm coming to some tentative conclusions:

"GPU Acceleration" is a misnomer. I think it's a marketing concept that is too dumbed down to illustrate the actual process. It's not descriptive. It leads us to think that it is taking the same process we do on the CPU, and somehow just doing it faster by having specialized hardware in the GPU take the load.

After a quick skim of the NVIDIA white papers on the subject, I think what's actually happening in a "GPU accelerated" render is that the render is offloaded to a dedicated hardware encoder in the GPU, which has its own embedded H.264 codec.

If that's true, it readily explains differences in quality, and potential variations between low-bitrate and high-bitrate qualities. Particular codecs are optimized for various tasks, and those of us who've been through numerous h.264 codecs have all realized that the x264 codec (not directly available from Vegas' render-as, or from an nVidia GPU-assist encode) is clearly superior to all others. This is also supported by a university study...

IMO, various manufacturers of encoding hardware and software (including MainConcept, Sony, and nVidia) don't spend enough effort optimizing low bitrate performance. To be fair, time is on their side, that is, all the internet pipes are getting bigger, including those connecting mobile devices.

Instead, the implied message from these manufacturers is "just give more bitrate to your encodes, then our quality is as good as everyone's..." As a certified Old Fart in streaming, that rubs me the wrong way, especially since the x264 open source codec so clearly is capable of better quality.

But, if you have to get 50 encodes out before 5pm, a single step process where you can more quickly encode straight out of Vegas starts to look pretty good, so why not give a little more bitrate to maintain quality?

I'm doing most of my mastering out to MXF from Vegas, a fairly fast render, then to the wonderful, fast, and free Handbrake (a leading GUI for x264). It's a two-render process, not of value to those who need 50 by 5pm, but the quality is second to none.

My conclusion: It isn't about the GPU or CPU, it's about the codec. If someone implements x264 in GPU (or Vegas) we'll really see some outstanding performance!
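For anyone who wants to try Seth's two-step workflow (master to MXF out of Vegas, then re-encode with x264), the second step can be sketched as a command line. This is a minimal sketch in Python, assuming HandBrakeCLI is installed and on the PATH; the file names and the 768 kbps target are hypothetical placeholders, and the flags used (-i, -o, -e, -b, -2) are standard HandBrakeCLI options:

```python
def handbrake_command(master, output, kbps=768):
    """Build an x264 encode command for a Vegas-rendered master file."""
    return [
        "HandBrakeCLI",
        "-i", master,     # input: the MXF master rendered out of Vegas
        "-o", output,     # output: web-ready MP4
        "-e", "x264",     # x264 software encoder (not a GPU encoder)
        "-b", str(kbps),  # target average video bitrate in kbps
        "-2",             # two-pass encoding for better bit allocation
    ]

# Hypothetical file names; pass the list to subprocess.run() to execute.
cmd = handbrake_command("master.mxf", "web.mp4")
```

When HandBrakeCLI is actually installed, `subprocess.run(cmd, check=True)` would run the encode.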
__________________
30 years of pro media production. Vegas user since 1.0. Webcaster since 1997. Freelancer since 2000. College instructor since 2001.
Seth Bloombaum is offline   Reply With Quote
Old June 8th, 2014, 01:07 AM   #10
Major Player
 
Join Date: Apr 2008
Location: Sydney Australia
Posts: 899
Re: GPU versus CPU rendering

Absolutely agree with Seth on his findings. x264 in Vegas would be a fantastic addition. The free Handbrake with x264 is only marginally beaten on quality by Telestream Episode Pro, which starts at $1,194.

Check out Jan Ozer's H.264/x264 comparisons in his PDF called 'Encoding H.264 Video for Streaming and Progressive Download.' It is a mine of information on the whole web MP4 delivery scenario. He covers what the mass online market, TV and corporate B2B, is doing in relation to suggested bit rates, pixel dimensions, pixel bit depth, coding methods etc. Studying his PDF along with Telestream's 'Encoding Best Practice' rapidly pointed me in the direction of creating better-quality encodes for the web. I just finished 13 ninety-minute-plus online learning seminar programs for a national medical body and ended up creating Handbrake-encoded x264 videos at 768 kbps that were way superior to anything I could put out of Vegas, MainConcept, TMPGEnc, Sorenson or Compressor. Compressor was so far behind Handbrake at these lower bitrates it wasn't funny.

To really get your head around all the settings in Handbrake, watch or download the ten Handbrake 'how-to' videos from YouTube. Well worth it if you want to extract the best out of Handbrake. I am totally convinced a good software encode with fully adjustable encoding attributes is way better than any of the GPU H.264 encodes at lower bitrates. Once you get upwards of 3,000 kbps the difference becomes less noticeable; from 5,000-6,000 kbps up I don't think I can really pick much difference. At the lower bitrates it's like chalk and cheese: the x264 software encodes leap ahead of any of the GPU H.264 encodes I have tried at similar bitrates and sizes.
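The file sizes implied by bitrates like these follow directly from bitrate times duration. A quick worked sketch (assuming 1 kbps = 1,000 bits/s and counting the video stream only, ignoring audio and container overhead):

```python
def stream_size_mb(kbps, minutes):
    """Approximate stream size in megabytes from average bitrate and duration."""
    bits = kbps * 1000 * minutes * 60  # kbps -> total bits over the duration
    return bits / 8 / 1_000_000        # bits -> bytes -> megabytes

# One 90-minute seminar program at 768 kbps:
size = stream_size_mb(768, 90)  # ~518 MB of video data
```

So each 90-minute program at 768 kbps works out to roughly 518 MB of video before audio is added.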

Jan Ozer link:

http://www.google.com/url?sa=t&rct=j...68445247,d.dGI


Telestream link:

http://www.google.com/url?sa=t&rct=j...68445247,d.dGI

Chris Young
CYV Productions
Sydney
Christopher Young is offline   Reply With Quote
Old June 8th, 2014, 09:53 AM   #11
Trustee
 
Join Date: Oct 2009
Location: Rhinelander, WI
Posts: 1,209
Re: GPU versus CPU rendering

Quote:
Originally Posted by Robin Davies-Rollinson View Post
Surely, if it's digital, one can't be better or worse than the other?
It most certainly can.

A CPU can do a lot of things that a GPU cannot. A CPU puts a few cores on roughly the same amount of silicon on which a GPU packs many. To cram so many cores onto a chip, the GPU can only support a limited set of instructions and, in the case of floating-point operations, it has to sacrifice precision.

Unlike integers, floating-point numbers are not represented exactly. They use the binary equivalent of what is known as scientific notation in the decimal world, which represents a number such as 123.456789 as 1.23E02, i.e., 1.23 times ten raised to the power of two. Such a notation sacrifices precision to achieve compactness.

In binary, floating point works the same way, but with a binary point instead of a decimal point. Each digit can only be a zero or a one (whereas in decimal each digit can be 0 through 9), so you need many more binary places in floating point than you would decimal places. And a GPU uses considerably fewer binary places than a CPU.

Not only that, but the large number of cores in a GPU forces its designers to use less exact hardware algorithms for its math, and to skip many useful math functions altogether. A CPU, for example, can natively calculate a sine, a cosine and many other functions (FWIW, it can calculate a sine and a cosine at the same time). A GPU typically offers only fast, lower-precision approximations of that kind of math.

You can only cram so much functionality onto a given area of silicon. And since a GPU places considerably more cores on roughly the same amount of silicon as the four or so cores of a CPU, there is no choice but to reduce the number and precision of available instructions.

A GPU is optimized for calculating the vertices of a 3D model to achieve high-speed gaming, where precision is not as important as it is in video editing, which generally does not even work with vertices but rather manipulates the colors of individual pixels using sophisticated mathematics. A gamer is too busy concentrating on winning to pay much attention to how precise each pixel is. As long as the overall image looks great, the gamer is happy.

A filmmaker/videomaker cares about every little detail in the image. But the GPU is not designed with a filmmaker/videomaker in mind. Its purpose is to produce impressive games, not impressive movies or videos.

So, yes, from a video editor’s point of view, one not only can be worse than the other, one usually is worse than the other.
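Adam's precision point is easy to demonstrate. Python's floats are 64-bit doubles, and the standard struct module can round one down to 32-bit single precision, making the lost information visible. This is an illustrative sketch of the gap between the two formats, not a model of any particular GPU:

```python
import struct

def to_float32(x):
    # Pack a 64-bit Python float into 4 bytes (single precision) and back,
    # discarding the mantissa bits a 32-bit float cannot hold.
    return struct.unpack("f", struct.pack("f", x))[0]

pixel = 0.1                  # e.g. a normalized color component
rounded = to_float32(pixel)  # nearest representable 32-bit value
error = abs(rounded - pixel) # precision lost by the narrower format
# error is on the order of 1e-9: tiny once, but it compounds over the
# thousands of operations applied to every pixel in a render
```

Exactly representable values such as 0.5 survive the round trip unchanged; most values, like 0.1, do not.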
Adam Stanislav is offline   Reply With Quote
Old June 8th, 2014, 09:14 PM   #12
Major Player
 
Join Date: Apr 2008
Location: Sydney Australia
Posts: 899
Re: GPU versus CPU rendering

Thx for the real tech explanation Adam. Makes even more sense to me now that I have an idea of what is going on in there.

Chris Young
CYV Productions
Sydney
Christopher Young is offline   Reply With Quote
Old June 8th, 2014, 10:10 PM   #13
Trustee
 
Join Date: Oct 2009
Location: Rhinelander, WI
Posts: 1,209
Re: GPU versus CPU rendering

You are quite welcome, Chris.
Adam Stanislav is offline   Reply With Quote
Old June 8th, 2014, 10:14 PM   #14
Regular Crew
 
Join Date: May 2008
Location: Burlington, Wisconsin
Posts: 137
Re: GPU versus CPU rendering

Thank you for these informative responses. I am amazed at the collective intelligence represented here.
Mervin Langley is offline   Reply With Quote
Old June 9th, 2014, 12:54 AM   #15
Trustee
 
Join Date: Dec 2002
Location: Gwaelod-y-garth, Cardiff, CYMRU/WALES
Posts: 1,215
Re: GPU versus CPU rendering

Thank you Adam too. Very enlightening!
__________________
TV Director / Cameraman
Robin Davies-Rollinson is offline   Reply


DV Info Net -- Real Names, Real People, Real Info!
1998-2017 The Digital Video Information Network