DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Sony XDCAM PXW-FS7 / FS5 (https://www.dvinfo.net/forum/sony-xdcam-pxw-fs7-fs5/)
-   -   10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii (https://www.dvinfo.net/forum/sony-xdcam-pxw-fs7-fs5/531768-10-bit-vs-8-bit-sony-fs5-vs-sony-a7sii.html)

Nigel Davey May 4th, 2016 04:40 AM

10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii
 
Interesting video/conclusions. I wonder how much it plays out with 8-bit 4K on an FS5 vs 10-bit 4K on an FS7?


David Heath May 4th, 2016 05:18 PM

Re: 10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii
 
The big problem is they are far from like for like tests. To be scientific, any test should really only change one variable at a time - so to sensibly draw valid conclusions about 8 v 10 bit working the ONLY thing that should change is the bitdepth. Which is not the case here. Yes, one example may be 8 bit and the other 10 bit - but with different cameras, different codecs, etc etc how can anyone draw conclusions that any observed difference is down to bitdepth and not other factors?

Simple answer is you can't. It's also the case that bitdepth issues CAN indeed cause banding (especially chroma banding), but banding can also be caused by compression issues. So it's not valid to automatically blame 8 v 10 bit for any flaws seen. (If you want a very simple validation of that, try saving a gradient in Photoshop at the minimum JPEG quality setting. You'll see significant banding, independent of bit depth.)
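A quick way to see how bit depth alone produces banding on a shallow gradient (quite separate from compression effects) is to count the distinct code values each bit depth can assign across it. A minimal numpy sketch, with illustrative figures of my own choosing rather than anything taken from the video:

```python
import numpy as np

# A smooth horizontal gradient covering only 10% of the signal range,
# as might happen with a sky occupying a narrow band of exposure.
width = 1920
gradient = np.linspace(0.45, 0.55, width)  # normalised 0..1 signal

for bits in (8, 10):
    levels = 2 ** bits
    quantised = np.round(gradient * (levels - 1))
    steps = len(np.unique(quantised))
    print(f"{bits}-bit: {steps} distinct code values across the gradient")
```

Roughly four times as many steps are available at 10 bit, which is why shallow gradients are where the extra bit depth shows. Heavy compression can then band either version further.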

And the whole point is that often it's the bandwidth that is constrained. So if you have to code at (say) 50Mb/s, then all else being equal 10 bit coding will mean a higher compression factor than 8 bit. So *CONCEIVABLY* any gain due to 10 bit may be outweighed by a loss due to higher compression.
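The bandwidth point can be put into rough numbers. Taking 1080p25 4:2:2 purely as an illustration (not any particular camera's figures), the raw data rate scales with bit depth, so a fixed 50Mb/s recording needs a higher compression factor at 10 bit:

```python
# Back-of-envelope only: real codecs are far more complicated than a
# simple raw-rate / recorded-rate ratio.

def compression_factor(bits_per_sample, recorded_mbps,
                       width=1920, height=1080, fps=25, samples_per_pixel=2):
    # 4:2:2 carries one luma sample plus (on average) one chroma sample
    # per pixel, hence samples_per_pixel=2.
    raw_mbps = width * height * samples_per_pixel * fps * bits_per_sample / 1e6
    return raw_mbps / recorded_mbps

for bits in (8, 10):
    print(f"{bits}-bit at 50 Mb/s needs ~{compression_factor(bits, 50):.1f}:1 compression")
```

That works out to roughly 16.6:1 for 8 bit against 20.7:1 for 10 bit at the same recorded rate, which is the trade-off being described.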

In the examples here, whilst one difference may be 8 v 10 bit, there are other differences as well. So many of the conclusions are pretty meaningless as presented.

Mark Rosenzweig May 4th, 2016 08:43 PM

Re: 10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii
 
Quote:

Originally Posted by David Heath (Post 1913928)
The big problem is they are far from like for like tests. To be scientific, any test should really only change one variable at a time - so to sensibly draw valid conclusions about 8 v 10 bit working the ONLY thing that should change is the bitdepth. Which is not the case here. Yes, one example may be 8 bit and the other 10 bit - but with different cameras, different codecs, etc etc how can anyone draw conclusions that any observed difference is down to bitdepth and not other factors?

Simple answer is you can't. It's also the case that bitdepth issues CAN indeed cause banding (especially chroma banding), but banding can also be caused by compression issues. So it's not valid to automatically blame 8 v 10 bit for any flaws seen. (If you want a very simple validation of that, try saving a gradient in Photoshop at the minimum JPEG quality setting. You'll see significant banding, independent of bit depth.)

And the whole point is that often it's the bandwidth that is constrained. So if you have to code at (say) 50Mb/s, then all else being equal 10 bit coding will mean a higher compression factor than 8 bit. So *CONCEIVABLY* any gain due to 10 bit may be outweighed by a loss due to higher compression.

In the examples here, whilst one difference may be 8 v 10 bit, there are other differences as well. So many of the conclusions are pretty meaningless as presented.

Did you watch the video? He recorded videos from each camera into an external recorder. So the compression used was exactly the same, as was the colour sampling (4:2:2). I agree that there may be other differences than bit depth between the different cameras that emerge from their HDMI/SDI ports, but codec was not one of those differences in at least part of this test. He also pointed out that compression matters, in his brief comparison with the C100, as does ISO. Please watch the video.

Christopher Young May 4th, 2016 10:10 PM

Re: 10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii
 
2 Attachment(s)
Alister Chapman wrote a fairly concise overview of 8-bit vs 10-bit encoding and how it relates to signal versus noise some years back. I couldn't find a link to Alister's page on XDCAM User but did find another link to it via HD Warrior. This is exactly how I have always understood the facts.

HD Warrior » Blog Archiv » 8bit or 10bit that is the question

The signal-to-noise ratio (SNR) figure of the sensor's A/D converter output and the camera's electronic circuitry is also an important factor in determining just how well a camera will reproduce noise-free images, be it 8 bit or 10 bit. To be able to reproduce a clean full 10-bit signal the SNR of the camera needs to be close to or better than 60dB. There are very few cameras delivering these sorts of SNRs across all three channels, R, G and B.

Assuming the image could be read from the sensor with no other noise, an A/D converter's resolution can be determined from the following table.

Given the SNR of the A7S II is in the low 40s dB, there would be no real advantage whatsoever in encoding the A7S signal as a 10-bit image, as you would be encoding a lot of unnecessary noise that would not be included when encoding to an 8-bit signal with its maximum dynamic range of 48dB.
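The 60dB and 48dB figures follow from the usual rule of thumb of roughly 6dB of dynamic range per bit (more precisely, 6.02N + 1.76 dB for an ideal N-bit quantiser). A small sketch of that arithmetic, as illustration only:

```python
# Ideal-quantiser rule of thumb: dynamic range ~ 6.02*N + 1.76 dB.
# Real cameras fall short of the ideal, so treat these as ceilings.

def quantiser_dynamic_range_db(bits):
    return 6.02 * bits + 1.76

def useful_bits(snr_db):
    # Bits of resolution actually supported by a given analogue SNR.
    return (snr_db - 1.76) / 6.02

print(f"8-bit ceiling : {quantiser_dynamic_range_db(8):.1f} dB")
print(f"10-bit ceiling: {quantiser_dynamic_range_db(10):.1f} dB")
print(f"42 dB SNR supports ~{useful_bits(42):.1f} bits")
```

On those numbers a camera with low-40s dB SNR is only delivering around 6-7 clean bits, which is the point being made about the A7S II.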

Another thing that is obviously going to help the image noise figure in Dugdale's tests is the fact that he is dropping a UHD image onto a 1080 timeline, which creates a reduction in visible noise levels by a factor of four. At a lowly 4K the video image of the A7S II looks amazingly good for an 8-bit camera. Put the A7S II up against some of its competitors in high-resolution still images well in excess of 4K and you see its shortcomings in relation to SNR, with its colour depth and dynamic range falling behind even the old Canon 5D Mk III.
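The downscaling point can be sanity-checked: averaging each 2x2 block of uncorrelated noise cuts the noise power by a factor of four, which halves the visible noise amplitude. A quick numpy simulation on a pure noise field, purely illustrative (real downscalers use fancier filters, and real sensor noise is not perfectly uncorrelated):

```python
import numpy as np

# Synthetic UHD-sized field of uncorrelated Gaussian noise, std = 1.0.
rng = np.random.default_rng(0)
uhd_noise = rng.normal(0.0, 1.0, size=(2160, 3840))

# Downscale to 1080p by averaging non-overlapping 2x2 blocks.
hd_noise = uhd_noise.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(f"UHD noise std: {uhd_noise.std():.3f}")
print(f"HD  noise std: {hd_noise.std():.3f}")
```

The HD standard deviation comes out at roughly half the UHD figure, i.e. a quarter of the noise power.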

Sony A7S II review: Lab tests: Sony A7S II signal to noise ratio | TechRadar

Sony A7S: Best Low-Light ISO score in our database - DxOMark.

Notwithstanding all of the above, I am extremely tempted to go for an A7S II, as in the real world it delivers very satisfying images for the price.

Chris Young
CYV Productions
Sydney

David Heath May 6th, 2016 05:44 PM

Re: 10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii
 
Quote:

Originally Posted by Mark Rosenzweig (Post 1913936)
Did you watch the video? He recorded videos from each camera into an external recorder.

Yes, I did watch the video, and have just watched it again to be sure.

It's quite true that in the first section a separate recorder was used, so yes, the codec gets taken out of the equation there - but later (when a C100 and a Red are used) it seems the internal recording of these is used? Similarly later on, in the tilts up to the sky, it's specifically stated that internal recording is used.

And the point I originally made was ".......with different cameras, different codecs, etc etc how can anyone draw conclusions that any observed difference is down to bitdepth and not other factors?" Perhaps it would have been better to say "different cameras AND/OR different codecs...." - even if codec is the same, inherent camera differences are still there.

Yet the whole point of the talk is "8 bit versus 10 bit". But it's more than likely that many of the differences seen are down to differences inherent between the cameras - not simple 8 versus 10 bit.

Quote:

Originally Posted by Christopher Young
Alister Chapman wrote a fairly concise overview of 8-bit vs 10-bit encoding and how it relates to signal versus noise some years back. I couldn't find a link to Alister's page on XDCAM User but did find another link to it via HD Warrior. This is exactly how I have always understood the facts.

HD Warrior » Blog Archiv » 8bit or 10bit that is the question

The signal-to-noise ratio (SNR) figure of the sensor's A/D converter output and the camera's electronic circuitry is also an important factor in determining just how well a camera will reproduce noise-free images, be it 8 bit or 10 bit.

Yes, exactly so, and Alister there states it all pretty well. (Original article at The 8 bit or 10 bit debate. | XDCAM-USER.COM if you're interested.) He also makes the point about 10 bit requiring a higher bitrate than 8 bit for *compression parity* (all else equal). The bullet point in the video at the start of this thread about "10 bit will always be better than 8 bit" simply isn't true as stated. Something like "if matched with a corresponding increase in bitrate" needs to be added. (And assuming same coder and codec.)

The other point Alister gets right is regarding noise. Hand in hand with the paragraph above, 10 bit may give a better result with a very noise free original signal - but much less in the presence of noise. In some circumstances 8 bit will actually give a BETTER result than 10 bit for a given bitrate - the available data is used to give lower compression, rather than greater bitdepth.

Nigel Davey May 14th, 2016 03:00 PM

Re: 10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii
 
Another FS5 comparison from Mr Dugdale:



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network