DV Info Net (https://www.dvinfo.net/forum/)
AVCHD Format Discussion
AVCHD video at higher bitrates (https://www.dvinfo.net/forum/avchd-format-discussion/488346-avchd-video-higher-bitrates.html)

Dave Blackhurst December 6th, 2010 10:26 AM

Arkady -
I think you are perhaps missing one aspect that others have mentioned. To quote Yoda, "there is no try, only do or do not" when it comes to playback - if the equipment or media is not up to handling the data stream, it falls apart or crashes entirely.

I think what you're saying is that at a given bitrate, regardless of whether it can be played back, there might be additional image information that can be extracted. My "wrench" post pretty well explains that the answer is "maybe", and why that would be true.

Let's say a "raw" data stream would represent, for the sake of argument, 100 Mbps - but you have to get that down to 17 Mbps to be able to play it back, so a significant reduction is required. If the data is sufficiently redundant, that reduction is relatively easy - the more "new" and different data there is, the more difficult the task becomes, and the less likely you are to find any "restorable" data.
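A minimal sketch of that redundancy effect, using a general-purpose lossless compressor as a stand-in for an encoder's lossless tools (the frame contents and the use of zlib are illustrative assumptions, not anything from a real camera):

Code:

import zlib
import random

width, height = 1920, 1080

# Highly redundant "frame": a uniform grey sky.
flat = bytes([128]) * (width * height)

# Low-redundancy "frame": per-pixel noise.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(width * height))

for name, frame in (("flat", flat), ("noisy", noisy)):
    ratio = len(frame) / len(zlib.compress(frame, 9))
    print(f"{name} frame compresses {ratio:.1f}:1 losslessly")

# The flat frame compresses roughly a thousand to one; the noisy frame
# barely compresses at all - past that point an encoder has to discard data.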

I saw the little bits of noise in the Mosque shot, particularly around the light - that's one of those "artifacts" that personally drives me up the wall: the sky is a smooth tone, and doesn't have a flock of mosquitos. As for the window detail, I thought the 80 Mbps sample was a tad cleaner - this is one of those "judgement calls" on image quality. I've seen people prefer an image with crushed blacks that tends to give a more contrasty overall image, but there is less shadow detail. I myself lean the other way, preferring more gradations of shadow, even though the overall image will look less contrasty - just a personal preference.

Arkady Bolotin December 6th, 2010 11:36 AM

Dave:

I agree completely with you – my phrase “If the processing power … isn’t up to the task” is an example of bad wording. It should be read as “If the H.264 decoder’s power … isn’t up to the task…”

Philosophically speaking, this entire dispute around high bitrates can be reduced to one question: given a particular AVCHD-based original video, what would be the optimal bitrate for the rendered movie? In other words, how can we choose the right bitrate for rendering?

This is an area open for experiments and taste contests.

Robert Young December 6th, 2010 12:19 PM

Quote:

Originally Posted by Arkady Bolotin (Post 1595482)
Philosophically speaking, this entire dispute around high bitrates can be reduced to one question: given a particular AVCHD-based original video, what would be the optimal bitrate for the rendered movie? In other words, how can we choose the right bitrate for rendering?

It sounds like maybe you are referring to an optimal bitrate for obtaining the best preview imagery within an editing system. That is a reasonable goal, and again there are existing solutions, the most common being to convert the raw AVCHD to a high-bitrate DI if needed (see the sketch below).
When you say "the optimal bitrate for the rendered movie", that means something different to me.
The movie (edited sequence) is usually rendered to a specific delivery format for distribution - Blu-ray, web, DVD, etc. - and those bitrates are determined by the pre-existing specs of the various formats.
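A hypothetical sketch of that DI step (ffmpeg and the ProRes intermediate codec are assumptions for illustration; the thread doesn't name any particular tool):

Code:

import subprocess

def avchd_to_di(src, dst):
    """Transcode an AVCHD clip to a high-bitrate digital intermediate."""
    subprocess.run(
        ["ffmpeg",
         "-i", src,            # e.g. "00001.MTS" straight off the card
         "-c:v", "prores_ks",  # intraframe, edit-friendly DI codec
         "-profile:v", "3",    # ProRes 422 HQ, roughly 180 Mbps at 1080
         "-c:a", "pcm_s16le",  # uncompressed audio
         dst],                 # e.g. "00001_di.mov"
        check=True)

avchd_to_di("00001.MTS", "00001_di.mov")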

Arkady Bolotin December 6th, 2010 12:44 PM

Robert:

Yes, industry standards are one thing, but the optimal bitrate is another. Even within the same standard (say, Blu-ray Disc), you can choose different bitrates (within a certain range) for rendering.

David Heath December 6th, 2010 01:28 PM

Quote:

Originally Posted by Arkady Bolotin (Post 1595393)
My initial intention was to avoid (by all means) a technical lecture on AVCHD and video compression essentials. Besides, this thread is rather over.

I had intended not to bother with any more answers, but..... well, one last time.....
Quote:

Originally Posted by Arkady Bolotin (Post 1595393)
However, the main tool is Lossless Entropy coding. Entropy coding is the coding technique that replaces data elements with coded representations, which can result in significantly reduced data size without data loss.

Just what evidence can you quote to support the claim that that is the *main* tool? It's certainly not my understanding. It's one tool, but only one amongst many - and most are likely to involve some loss.

Think about it. Uncompressed HD video is around 1 Gbps at 8-bit depth. AVC-HD is about 20 Mbps (average). That's roughly a 50:1 average compression ratio! There is simply no way any system can achieve that and still be truly "lossless" on normal pictures. It's a tribute to the codec that it manages it at all without the lost data being visibly missed under normal viewing.
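The arithmetic, as a quick sketch (8-bit 4:2:2 sampling at 30 frames per second is an assumption behind the "around 1 Gbps" figure):

Code:

# Back-of-envelope check of the 50:1 figure.
frame_w, frame_h = 1920, 1080  # full HD
fps = 30                       # 60i carries about 30 full frames per second
bits_per_pixel = 8 * 2         # 8-bit 4:2:2: two samples per pixel

raw_bps = frame_w * frame_h * bits_per_pixel * fps
print(f"uncompressed: {raw_bps / 1e6:.0f} Mbps")         # ~995, i.e. ~1 Gbps

avchd_bps = 20e6               # typical AVCHD average
print(f"compression ratio {raw_bps / avchd_bps:.0f}:1")  # ~50:1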
Quote:

Originally Posted by Arkady Bolotin (Post 1595393)
If the processing power (or the memory size) isn’t up to the task (or the bitrate is constrained to a low limit), the fidelity of playback deteriorates (by throwing away some of the macroblocks or even the picture slices). This, of course, has nothing to do with the faithfulness of the compressed video, which can be restored (decompressed) in some other way.


I hope this crash introduction to coding techniques answers all the questions raised earlier in the course of the thread.

No, far from it. The first paragraph implies that whatever bitrate the original video is compressed at, the original can always be reconstructed by a powerful enough computer. If true, the expectation would be that playback quality would vary widely with the power of the hardware it's replayed on.

But that is not the case in practice. Quality is independent of the power of the decoding computer - until a lower threshold is approached, when playback first becomes stuttery, then fails completely. The compression quality is a function of the individual coder and bitrate - not the replaying equipment.

At AVC-HD bitrates, H264 will have to discard information at the coder, and once lost, it's lost.

Arkady Bolotin December 6th, 2010 04:17 PM

Oy vey, David, give me a break…

You dismiss my statements as baseless and instead offer your own, which are equally deniable and unsubstantiated…

Think about it. The problem here is not logic or wordplay; we are arguing about something that can only be proved or disproved by experiments and observations.

Neither you nor I possess any real knowledge about which AVCHD coder Sony or any other manufacturer uses in their cameras, what algorithms they apply, which H.264 profile they employ, or how lossless or lossy the encoded video could be in various situations.

Anyway, thank you for answering. I am glad you are still here.

David Heath December 6th, 2010 05:02 PM

Quote:

Originally Posted by Arkady Bolotin (Post 1595080)
You dismiss my statements as baseless and instead offer your own, which are equally deniable and unsubstantiated…

Think about it. The problem here is not logic or wordplay; we are arguing about something that can only be proved or disproved by experiments and observations.

OK - it's an observation that a computer with too little power won't play back AVC-HD at all. Increase the processing power, and at a certain threshold it will start to play with stuttering, then with a little more power play cleanly. (I think Dave Blackhurst confirms this?) No matter how much more powerful a processor you then put on the task, it won't make any better job of the decoding, quality-wise - the coding quality is a function of bitrate for any given coder.

Try that as an experiment for yourself. Play your samples on a range of computers of varying power and see if you notice any quality difference.

That's why I feel my statements are not unsubstantiated, and ARE based on observations. (As well as theory.)

I'll let everybody else make up their own mind.

Arkady Bolotin December 6th, 2010 07:03 PM

You said “No matter how much more powerful a processor you then put on the task, it won't make any better job of the decoding quality wise - the coding quality is a function of bitrate for any given coder”.

But this is an exact confirmation of what I did: I made two clips in different video formats – AVCHD (M2T) and VC-1 (WMV) – at different bitrates – 24 Mbps and 80 Mbps respectively – and showed that “the coding quality is a function of bitrate”, i.e. that the quality of the 80 Mbps WMV clip is higher.

Look, David, we could keep this up forever: your argument, my argument, and so on, and so on…

I think we should stop right here. I am sure everyone made up their mind a long time ago.

Dave Blackhurst December 6th, 2010 09:57 PM

There are TWO variables here, as we've been discussing: the horsepower of the hardware, AND the encoding bitrate. If something is recorded at a lower bitrate, it'll play back on lower grades of hardware, but it will not look as good as a higher bitrate stream, and the likelihood of ANY extra detail being available to extract is pretty small.

Based on what I understand of compression, it should, under some circumstances, be possible to pull additional detail out of a decompressed stream and, in theory, use a higher bitrate to "keep" that detail in a more intact form.

I think the main problem with the theory is that one needs to know how well movement in the frame is dealt with before one could even tell whether the endeavor is worth the hassle. My suspicion is that the specifications for the standard are set up to be as "graceful" as possible in handling detail and movement with "average" hardware under "average" shooting conditions - and under those "average" conditions, the possible improvement would be minimal.

Arkady Bolotin December 8th, 2010 04:40 PM

AVCHD video at higher bitrates revealed
 
To decipher the “mystery” of the more detailed appearance of the high-bitrate clips rendered from original AVCHD video, two words have a very important meaning. Those words are transcoding and interpolation.

Transcoding is the conversion between different encoding formats. Naturally, transcoding between two lossy formats – be it decoding and re-encoding to a different codec or to a different bitrate – causes generation loss (i.e. loss of video quality).
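A minimal sketch of generation loss, with JPEG standing in for the lossy video codecs (purely an assumption for illustration; the filenames are hypothetical):

Code:

from io import BytesIO
from PIL import Image  # pip install pillow

img = Image.open("frame.png").convert("RGB")  # hypothetical exported frame
for generation in range(5):
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=75)  # lossy encode
    buf.seek(0)
    img = Image.open(buf).convert("RGB")      # decode for the next pass
img.save("frame_gen5.jpg")  # visibly softer than a single-pass encode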

However, if lossy-compressed video is transcoded to a lossless or uncompressed format, the conversion can technically be qualified as lossless, because no further information is lost.

In practical terms, this means that transcoding video from AVCHD at 24 Mbps to VC-1 at 80 Mbps shouldn’t produce (perceptible) video loss. In other words, the original video clip and the transcoded one must have the same video quality.

So why do the VC-1 clips appear more detailed? It is due to the bicubic interpolation utilized in the VC-1 video codec. By constructing new sub-pixels, bicubic interpolation brings about the appearance of fractionally filled pixels (i.e. smoothness) in the picture. In other words, the bicubic algorithm (which is lacking in the AVCHD format) increases apparent sharpness (i.e. the detail level).
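A sketch of what such interpolation does to apparent detail, using Pillow's resampling filters as a stand-in (nothing here comes from the VC-1 spec itself, and the filenames are hypothetical):

Code:

from PIL import Image  # pip install pillow

crop = Image.open("window_detail.png")  # hypothetical small crop

# Nearest-neighbour keeps only the original samples: blocky edges.
blocky = crop.resize((400, 400), Image.NEAREST)
# Bicubic constructs new in-between sample values: smoother gradients
# that read as extra fine detail.
smooth = crop.resize((400, 400), Image.BICUBIC)

blocky.save("window_nearest.png")
smooth.save("window_bicubic.png")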

The downside of bicubic interpolation is that it reduces contrast: that is why the rendered clips look less contrasty.

Thus, transcoding video from AVCHD at 24 Mbps to the (more advanced) VC-1 format at 80 Mbps is merely a video enhancement. That’s it, nothing more, nothing less.


POST SCRIPTUM:

Robert and David, I think I owe you both a great many apologies.

David, I feel very sorry. You were right, I was wrong. Instead of paying attention to your arguments, I drowned them in empty demagogy.

My sincere apologies also go to everyone who felt mystified by those high-bitrate clips. I was bewildered by them as well. But that is called being human.

Robert Young December 11th, 2010 02:10 AM

Arkady
One of the very useful purposes of these fora is to stimulate interesting discussion.
I can't help but notice that your original post has generated 4 pages of discussion and more than 1,000 viewers - a sure sign you hit a sweet spot!
When you notice something interesting, intriguing, or unusual, you can hardly go wrong by presenting it here.
It's not so important to be right or wrong- it's the challenge of examining an interesting topic that makes it fun and thought provoking for everyone :)

Ron Evans December 11th, 2010 09:52 AM

Arkady
I echo Robert's post. It has been a very interesting discussion, and that is why I read the forum. Keeps my retired brain active!!!

I look forward to your next discussion.

Ron Evans

Phil Lee January 8th, 2011 04:42 AM

Hi

I was just reading this with interest.

I think what might be overlooked is that the conversion to higher bitrates also seems to involve swapping to a different codec. What is being observed is probably differences in the de-interlacing taking place, depending on which decoder is loaded and used and how much CPU is left over. Certainly I always find that MPEG2 at Blu-ray bitrates, compared to H264 at the same bitrate, looks more watchable on the computer, even though H264 is more efficient and can do even more with the high bitrate.

Maybe a better test would be to use progressive video to rule that out, but you also can't rule out a different decoder adding a bit of sharpening or other processing, or the placebo effect, which might also be a big part of this.

The acid test is to export frames with lots of detail from an editor (you can then pick the exact same frames) and flick between them as static images. Using Alt-Tab you can do this very easily on a PC. I've taken snapshots from the original AVCHD footage, from footage exported as uncompressed video, and from footage converted to high-bitrate MPEG2 and H264, and they have looked exactly the same!
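A sketch of how that comparison could be automated rather than eyeballed (the filenames are hypothetical, and Pillow plus NumPy are assumptions):

Code:

import numpy as np
from PIL import Image  # pip install pillow numpy

a = np.asarray(Image.open("frame_avchd.png").convert("RGB"), dtype=float)
b = np.asarray(Image.open("frame_h264.png").convert("RGB"), dtype=float)

mse = np.mean((a - b) ** 2)
if mse == 0:
    print("frames are bit-identical")
else:
    psnr = 10 * np.log10(255.0 ** 2 / mse)
    print(f"PSNR: {psnr:.1f} dB")  # above ~45 dB differences are invisible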

Regards

Phil


