DV Info Net

Joe Busch September 11th, 2007 07:20 AM

Vegas Pro... 8bit vs. 32bit Samples...
 
I just downloaded the Vegas Pro trial... I haven't seen a huge performance increase, but it seems much more stable...

I did a little test to see the difference between 8-bit and 32-bit.

I found I'd rather edit and color correct in 8-bit than edit in 32-bit. I get roughly 10-11 fps playback at full HD (converting from 60i to 30p on the fly) in 8-bit vs. 1.5 fps in 32-bit... so about 15% of the performance for slightly more accurate colors?

http://www.lousyhero.com/pics/dvi/8bit.jpg
http://www.lousyhero.com/pics/dvi/32bit.jpg
http://www.lousyhero.com/pics/dvi/8bitcc.jpg

First is 8-bit
Second is 32-bit
Third is 8-bit with a slight color correction filter dropped on (not tweaked)

Chris Barcellos September 11th, 2007 09:47 AM

Joe:

Conclusion? 32-bit looks nice, but the color-corrected version looks as good, too, IMHO.

What is the difference in render times between the various scenarios?

John Miller September 11th, 2007 09:59 AM

Looking at the luma histograms, the 32-bit ranges from 0-255, whereas the 8-bit ranges from approximately 16 to 250.

Is this an IRE setup thing plus conforming (loosely) to the usual 16-235 range?

As a result, the 8-bit does look somewhat less vibrant.

The histogram for the color corrected version shows the black clipping strongly.

Of course, looking at an RGB bitmap can be misleading anyway. The proof of the pudding is really in the YCrCb channels, as far as the validity of the pixel values is concerned.
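
For anyone who wants to check this on their own grabs, here's a rough numpy sketch of the test (the synthetic frame and the Rec. 601 luma weights are just stand-ins; HDV itself is Rec. 709):

[code]
import numpy as np

# Stand-in for a real frame grab: random 8-bit RGB pixels.
rgb = np.random.randint(0, 256, size=(1080, 1920, 3)).astype(np.float64)
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

# Rec. 601 luma weights; Rec. 709 (HDV) would be 0.2126/0.7152/0.0722.
y = 0.299 * r + 0.587 * g + 0.114 * b

hist, _ = np.histogram(y, bins=256, range=(0, 256))
print("luma min/max:", y.min(), y.max())
print("pixels below 16:", hist[:16].sum(), " above 235:", hist[236:].sum())
[/code]

If nearly everything lands inside 16-235, the footage is conforming to studio range; values piled up at 0 or 255 suggest full-range handling (or clipping) somewhere in the chain.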

Danny Fye September 11th, 2007 10:35 AM

The problem I have is that while 32-bit looks better than 8-bit on the timeline, it looks worse when rendered. The colors were blah and the black levels were wrong. It looks like it would if one went from a 3-CCD SD cam to a single-CCD SD cam.

Also, when rendering 16:9 SD to MPEG, 32-bit took 3 times longer and was not as good.

When rendering to WMV, the quality was about the same, but it took 4 times longer to render.

There may need to be a VP8 'b' or 'c' update to adjust how 32-bit renders?

Oh yeah, 32-bit makes previewing way too sluggish. I have a newly built fast system with a quad core, 4 GB of RAM and a good video card. I think it may be a while before I want to use the 32-bit thing...

Danny Fye
www.dannyfye.com
www.vidmus.com/scolvs

John Miller September 11th, 2007 10:43 AM

Can you disable 32-bit floating point (fp) processing in Vegas 8?

Whilst developing our software, I toyed with 32-bit fp vs. integer and found that it was slower for the same calculations. Primarily, this is because it takes a relatively long time for the CPU to convert to and from fp.
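
A minimal timing sketch of that conversion cost, with numpy standing in for a per-pixel filter (the actual numbers depend entirely on the CPU):

[code]
import numpy as np
import time

frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

t0 = time.perf_counter()
out_int = (frame >> 1) + 16                  # gain + offset in integer math
t1 = time.perf_counter()

f = frame.astype(np.float32) / 255.0         # uint8 -> float conversion
out_f = f * 0.5 + 16.0 / 255.0               # the same gain + offset in float
out_back = (out_f * 255.0 + 0.5).astype(np.uint8)  # float -> uint8 again
t2 = time.perf_counter()

print(f"integer path: {t1 - t0:.4f}s   float path: {t2 - t1:.4f}s")
[/code]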

Glenn Chan September 11th, 2007 01:03 PM

1- Yes, you can disable 32-bit floating point processing in Vegas: keep the project at 8-bit bit depth.

Some of the filters process in 32-bit float internally (in 8-bit projects).

2- Vegas has two different 32-bit modes:

A compositing gamma of 2.222. This behaves like old-school Vegas.

A compositing gamma of 1.000. All the values are converted from gamma-corrected values to linear-light values for processing. This is slower, and filter output is different.
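
The practical difference between the two modes is easy to see with a toy blend. This is only a sketch (Vegas' actual compositing internals aren't published), but the math follows the stated gammas:

[code]
gamma = 2.222
a, b = 0.0, 1.0                       # normalized code values of two layers

# Gamma-2.222 mode: blend directly on the gamma-encoded values.
blend_gamma = (a + b) / 2             # 0.5

# Gamma-1.000 mode: linearize, blend, then re-encode.
lin = (a ** gamma + b ** gamma) / 2
blend_linear = lin ** (1 / gamma)     # ~0.73 -- noticeably brighter

print(blend_gamma, blend_linear)
[/code]

Same two layers, different answers, which is why filter output changes when you switch modes.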

3- John, check your private messages... I'd like to bounce some computer programming stuff off you.

Michael Wisniewski September 11th, 2007 02:08 PM

Thanks for the comparison, Joe - would love to see the 32-bit CC to round it out.

Jon Fairhurst September 11th, 2007 02:29 PM

Glenn,

That's critical information (1.0 vs 2.222 gamma).

If the gamma, gain and offset are properly set, the only differences between 8-bit and 32-bit float would be:

1) Less contouring in shallow gradients (see the sketch below),
2) Less noise for values that fall between levels, and
3) Slower processing.

Oh well, you can't have it all...

That said, it might take a bit before we figure out that "properly set" part.
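
To make point 1 concrete, here's a toy numpy sketch (made-up gradient, nothing Vegas-specific): a shallow gradient pushed through an 8-bit pipeline lands on only a handful of distinct levels, which shows up as banding, while float keeps it smooth until the final quantization.

[code]
import numpy as np

# A shallow gradient: 1920 pixels spanning only ~3% of the signal range,
# as you might get after a strong gain-down from a levels/exposure filter.
ramp = np.linspace(0.45, 0.48, 1920, dtype=np.float32)

eight_bit = np.round(ramp * 255).astype(np.uint8)   # quantized mid-chain

print("distinct 8-bit levels:", len(np.unique(eight_bit)))   # ~8 steps
print("distinct float values:", len(np.unique(ramp)))        # ~1920
[/code]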

Joe Busch September 11th, 2007 04:12 PM

Quote:

Originally Posted by Michael Wisniewski (Post 742674)
Thanks for the comparison Joe, would love to see the 32-bit CC to round it out.

It may be better, but I think the point is that you can color correct 8-bit to look better than 32-bit and still have it preview at decent frame rates.

All of my HDV stuff has that weird grey overlay over it... and I assume it's from a lack of color depth... I drop some color correction filters on, and tweak them if I have time... I'm not overly concerned about accurate colors, I just want it to look better.

Michael Wisniewski September 11th, 2007 04:51 PM

Heh, heh. The question in my mind is: will color-corrected 32-bit look better than color-corrected 8-bit? I'd do some testing myself, but I'm having issues getting V8 Pro to stay up and running on my system.

Glenn Thomas September 12th, 2007 04:09 AM

Sorry, I just found this thread... So Vegas 8 just converts 8-bit HDV footage to 32-bit? I thought for 32-bit to actually make any difference, you would need to capture your clips at a higher bit depth in the first place, using something like an Intensity card and maybe the 10-bit Cineform Neo HD codec?

Dan Keaton September 12th, 2007 07:13 AM

I do not have my copy of Vegas Pro 8 yet, so I have not done any testing.

Floating-point speed on different hardware (different computer CPUs, different brands: AMD, Intel, others) is dramatically different.

So it would be appropriate to post specific computer details with each post referring to performance timings.

Alastair Brown January 25th, 2008 04:33 AM

2 Attachment(s)
I mistakenly stumbled into this last night after setting my project properties to 32-bit instead of 8. It took 6 hrs 3 mins to render a 55-minute sequence of HD on a quad core with 4 GB of RAM. Once I sussed out what I had done, I had a look at the difference and was pretty surprised.

The "grey cast" that Joe mentions in his second post is plain to see in the Comparison 2 grab.

Did some digging and found this thread.

Decided to try some color correcting (Levels, with some tiny tweaks to Input Start and Gamma) to try and match the 8-bit to the 32-bit, and got "close but no cigar". It seems to have a little less gloss, and the whites don't "pop" as much.

Would be interested to hear what the "Big Guns" would suggest to get a better match.
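
For what it's worth, the Levels move described above boils down to something like this; the in_start and gamma numbers here are invented for illustration, not the actual settings used:

[code]
import numpy as np

def levels(x, in_start=16 / 255, in_end=1.0, gamma=1.08):
    """x is a float image in 0..1; mimics Levels' Input Start/End and Gamma."""
    y = (x - in_start) / (in_end - in_start)    # re-stretch the input range
    return np.clip(y, 0.0, 1.0) ** (1.0 / gamma)

frame = np.linspace(0.0, 1.0, 256, dtype=np.float32)  # stand-in for a frame
matched = levels(frame)   # blacks pulled down to 0, mids lifted slightly
[/code]

The clip before the gamma matters: without it, values below Input Start go negative and the fractional power produces NaNs.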

Alastair Brown January 25th, 2008 06:35 AM

2 Attachment(s)
Just done a side-by-side of Joe's pics as well, for ease of viewing.

Bill Ravens January 25th, 2008 08:54 AM

I think you guys are a little confused. Vegas doesn't convert 8-bit data to 32-bit. Universal law: you can't get something from nothing. What Vegas does do is process (as in calculate) the data with 32-bit floating-point math precision instead of 8-bit fixed-point math. Yes, you get more accuracy (as in decimal places) and less banding.
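
One way to picture that precision point is a toy chain, rounding back to 8 bits after every step the way a fixed-point pipeline must (a sketch; real filters are more involved, but the rounding behavior is the issue):

[code]
import numpy as np

x = np.arange(256, dtype=np.uint8)    # every possible 8-bit level

# 8-bit fixed point: round back to integers after each filter.
a8 = np.round(x * 0.30).astype(np.uint8)                          # gain down
b8 = np.clip(np.round(a8 * (1 / 0.30)), 0, 255).astype(np.uint8)  # gain back up

# 32-bit float: carry full precision through the chain, round once at the end.
b32 = np.clip(np.round(x * 0.30 * (1 / 0.30)), 0, 255).astype(np.uint8)

print("max 8-bit error:", np.abs(b8.astype(int) - x.astype(int)).max())   # 2
print("max float error:", np.abs(b32.astype(int) - x.astype(int)).max())  # 0
[/code]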

Also, the color bandwidth you see out of a render depends on a whole lot of things besides 8-bit or 32-bit processing. Are your monitors calibrated? What codec did you render and decode with? All of these things will affect the black (shadow) and white (highlight) values you are looking at. 32-bit seems to affect that because it changes the way a codec handles the Y'PbPr conversion to RGB and back. Each codec is different. You can't conclude that the effect you see, haziness or crushed shadows, is the fault of 32-bit processing alone.
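
The range handling alone can produce the haze people are describing. A bare-bones illustration of just the luma scaling (real codecs do this inside the decoder, and the chroma side has its own version of the same issue):

[code]
import numpy as np

y_studio = np.array([16, 126, 235], dtype=np.float32)   # black, mid, white

# Correct studio-range decode: map 16-235 out to 0-255.
rgb_correct = np.clip((y_studio - 16) * 255 / (235 - 16), 0, 255)

# "Wrong" decode: treat studio-range data as if it were already full range.
rgb_wrong = y_studio

print(rgb_correct)   # [0, ~128, 255] -- proper black and white points
print(rgb_wrong)     # [16, 126, 235] -- lifted blacks, dim whites: the grey cast
[/code]

Two decoders disagreeing on this one assumption can show exactly the kind of washed-out vs. contrasty difference being compared in the screen grabs.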

