View Full Version : Are broadcast video standards ideal for YouTube playback?
Scott Wilkinson November 12th, 2012, 08:15 PM Yet another debate in our production facility is over how to create the best picture for YouTube-only playback. Many of our videos will literally never be seen on broadcast television---nor will they ever be seen on any type of DVD. They are YouTube only---that's it. No repurposing.
Given this, it seems to me that we should not (for the YouTube-only videos) be constrained by broadcast standards (e.g. where white levels and black levels are concerned). Our only constraints should be identical to any that may exist for editing digital still images in Adobe Photoshop/Lightroom/Bridge (because we are talking about a 100% non-broadcast, computer-only environment).
Many of my producers come from broadcast backgrounds, and seem entrenched in broadcast video standards and are reluctant to deviate from them. Yet I can often grab a frame from one of our videos, open it in Photoshop or Lightroom, and make it look better. In many cases all I do is tweak levels (not broadcast video levels, but "levels" as in Photoshop's Levels tool).
The other issue is the known gamma difference between Mac and PC displays. Again, since many of our videos are ONLY seen on computers (and never broadcast)...it seems that we should be taking this Mac/PC gamma difference into consideration.
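To put a rough number on the gamma issue (assuming the oft-cited legacy Mac gamma of 1.8 versus the PC/sRGB 2.2; newer versions of OS X default to 2.2, so this matters most on older Macs), the same mid-gray pixel renders noticeably brighter on the gamma-1.8 display. A quick Python sketch, for illustration only:

v = 0.5             # normalized pixel value (50% gray)
print(v ** 1.8)     # ~0.29 displayed luminance on a legacy gamma-1.8 Mac display
print(v ** 2.2)     # ~0.22 displayed luminance on a gamma-2.2 PC display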
I'd appreciate thoughts from others on how you deal with this. Even though broadcast standards were developed before people routinely watched videos on their desktop/laptop computers or smartphones or tablets...do you still believe broadcast standards should be adhered to when mastering video that will never be broadcast?
My producers say yes---and they say broadcast standards offer the only consistent baseline for production. My opinion, though, is that even if broadcast standards offer a baseline for consistency...that consistency does us no good if our videos are consistently a bit off when viewed on someone's desktop computer display.
Scott
Battle Vaughan November 12th, 2012, 08:47 PM I think YouTube is saying what I saw once posted on a vending machine ---"Please try our way first!"
Here's a link to YouTube's professional standards sheet:
Advanced encoding specifications - YouTube Help (http://support.google.com/youtube/bin/static.py?hl=en&guide=1728585&topic=1728573&page=guide.cs)
The broadcast standards, IIRC, are set by legislation so that the black and white levels stay within the FM frequency envelope assigned to the broadcaster. High whites can push the signal out of band. So those constraints, in that sense, have nothing to do with digital video posted in a totally different medium. Horses for courses.
Scott Wilkinson November 13th, 2012, 07:30 AM @Battle: thanks---you stated what I believe---that broadcast standards have nothing to do with delivery via YouTube to computers, laptops, smartphones, tablets, etc.
I checked out that link you provided---which is useful! But YouTube's own specs say nothing about color or luminance levels. This is undoubtedly due to the enormous variety of displays on which their videos are seen.
I can understand why my producers (and some others) may assert that in the absence of any kind of color/luminance standards for YouTube delivery, sticking to broadcast standards makes the most sense (because it's the only standard there is).
But my belief is that there *is* a standard of sorts already out there that can easily be applied to YouTube videos: the standard by which every professional digital photographer judges their digital still images (and the techniques with which digital still images are manipulated in Photoshop, Lightroom, Aperture, Bridge, etc.)
There are plenty of videographers who came from a digital still image background (and made the move to video through DSLRs). For these folks, it's perfectly natural to apply the same standards to their video that they did when shooting stills.
But it seems to me that making such a leap is far more difficult for video people who come from a broadcast background---because their entire "world view" is that of broadcast video.
This is the challenge I'm confronted with---getting broadcast video people to (temporarily, at least) set aside their broadcast dogma and embrace digital image manipulation in apps intended for that purpose (and with the similar tools that Adobe built into Premiere Pro).
Scott
Scott Wilkinson November 13th, 2012, 12:19 PM So an open question to everyone: if you're producing a video that will ONLY ever be seen on YouTube, do you adjust the image (brightness, color, contrast, etc.) to where you think it looks good (on your computer display)---with no consideration for broadcast standards?
Or do you get anal with a vectorscope and waveform monitor?
Just curious. :-)
Scott
Scott Wilkinson November 13th, 2012, 03:58 PM Wow...150 views and not much feedback. This seems to underscore what I've suspected---that there are no mastering standards for purely web-based video whatsoever (beyond what our eyes tell us while looking at it on a variety of computer displays).
As I've suggested above, I think the standards that make most sense for web video are identical to those used for digital still image manipulation---and (in my opinion) broadcast standards have nothing to do with it. (Again---for video that will NEVER be broadcast on television---and there is a LOT of this kind of video today.)
And keep in mind that when I discuss standards, I'm not talking about encoding standards (there are plenty of those!). I'm talking about what might be called "color grading for YouTube." :-)
For what it's worth, we routinely produce videos that---if we were in the private sector---would have $50,000 production budgets. And these videos will never be seen on broadcast television. Is that shortsighted? Not at all---we reach our audiences where they look for us---and that is not on television.
Scott
David Heath November 13th, 2012, 04:49 PM So an open question to everyone: if you're producing a video that will ONLY ever be seen on YouTube, do you adjust the image (brightness, color, contrast, etc.) to where you think it looks good (on your computer display)---with no consideration for broadcast standards?
Or do you get anal with a vectorscope and waveform monitor?
Personal feeling is that vectorscopes and waveform monitors as such were tools for the analogue era, especially vectorscopes. Most of their usage went away with the move from composite to component.
That said, the real issue with video is that you don't necessarily know what's going to happen to it further down the chain. And the danger is that it may look OK on your screen, but something happens later which causes all sorts of problems. Your setup may not be clipping, say, but further down the chain...
And that's the point of broadcast standards: to set absolute limits that try to guarantee interchangeability.
They relate to quite a number of factors, and some, such as aliasing, are just as applicable to YouTube video as to broadcast. The big issue with aliases is that they can make encoders less efficient, even if they're barely visible in the pictures themselves. So to get equivalent quality you have to encode at a higher bitrate with an alias-prone camera.
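For what it's worth, the basic waveform-monitor question (where does the luma of a frame actually sit?) can be approximated in software. A minimal sketch using OpenCV and NumPy; the file name is just a placeholder, and note that most decoders and frame grabs hand you full-range RGB, so this shows delivered levels rather than the encoded Y' code values:

import cv2
import numpy as np
img = cv2.imread("frame_grab.png")                     # placeholder: a frame grab from the edit
luma = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
print("min:", int(luma.min()), "max:", int(luma.max()))
print("1st/99th percentiles:", np.percentile(luma, [1, 99]))
# If the values never leave roughly 16-235, the frame is still sitting in the broadcast range.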
Brian David Melnyk November 14th, 2012, 02:59 AM to use an audio analogy, i mix my music on reference monitors, but also listen to the mixes on computer speakers, car speakers, ghetto blasters, headphones etc. etc. to make sure the mix translates as universally as possible.
when i color grade i view the scopes and the image on HDTV and also computer monitor and try to get an image that translates well on both. without scopes, i feel like i'm flying blind. not sure if this is a good technique, but i like to try to adhere to broadcast quality standards even if it is just for the net. maybe i am just compromising both images?
another (perhaps more appropriate) audio analogy is that mixes used to only have so much bass, otherwise the needle would skip, which is now a pretty irrelevant standard (well, except for the vinyl revival...)
Scott Wilkinson November 14th, 2012, 07:14 AM If you'll allow me to hijack my own thread for a minute...
I think the biggest issue here is black and white levels. Even in an all-digital workflow (with the end product being viewed ONLY on YouTube), many producers still insist on adhering to broadcast standards (either IRE 7.5 blacks or keeping the range between 16 and 235).
My point is that in an all-digital workflow---when neither analog devices nor broadcast television are anywhere in the stream---it is silly to limit black and white levels to 16 and 235. For videos intended strictly for YouTube delivery, we should be using super-blacks (0-15) and super-whites (236-255). Not using this full range (in my opinion) is reducing the potential quality of our all-digital videos.
Professional digital photographers, when editing their images in Photoshop, don't clamp their blacks to 16 and their whites to 235. We shouldn't be doing that either.
To my eye, the absence of "super-blacks" and "super-whites" in a YouTube video is very noticeable. When the levels are clamped to broadcast standards, YouTube videos look damp, subdued, and lack vividness and vitality.
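To make the comparison concrete, here is a rough sketch (NumPy; the function name is mine) of the Photoshop-Levels-style stretch I'm describing: mapping a frame's darkest and brightest values to 0 and 255 rather than leaving them parked at the broadcast limits.

import numpy as np
def stretch_to_full(luma):
    # Map the frame's actual min/max onto 0-255, like a Levels black/white point pull.
    lo, hi = float(luma.min()), float(luma.max())
    out = (luma.astype(float) - lo) * 255.0 / max(hi - lo, 1.0)
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
# Luma values graded to the broadcast limits gain contrast once stretched:
y = np.array([16, 60, 128, 200, 235], dtype=np.uint8)
print(stretch_to_full(y))   # -> [  0  51 130 214 255]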
Any comments? :-)
Scott
Roy Feldman November 14th, 2012, 09:17 AM It is my experience that YouTube "expects" a video made to the .709 spec: when I feed it a 0-255 file it increases the contrast, but when I feed it a .709 file it expands it to show 0-255 (almost). It would be very interesting to see how far the dynamic range could be pushed. Most NLEs export to the .709 standard, but I know that Sony Vegas doesn't unless set that way, making some very contrasty YouTube presentations.
At the risk of being redundant, I can definitely see a gamma shift applied by YouTube to "correct" a broadcast-spec file and broaden the range. I would love to have some insight into how this works: what standard do they use, and why can the same file look so different in different browsers?
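My guess (and it is only a guess) is that the extra contrast comes from a full-range file being expanded as if it were 16-235 video range: everything gets stretched again, and anything below 16 or above 235 clips. A small NumPy illustration of that assumption:

import numpy as np
full_range = np.array([0, 16, 128, 235, 255], dtype=float)   # values already spanning 0-255
expanded_again = np.clip((full_range - 16.0) * 255.0 / 219.0, 0, 255)
print(expanded_again.round())   # -> [  0.   0. 130. 255. 255.]  shadows and highlights crushed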
I hope this discussion carries forth.
Ross Zuchowski November 14th, 2012, 10:00 AM I suppose as long as you're using a calibrated monitor to do your color correction, there won't be an issue in delivering these "super" levels you're discussing.
I guess my question is: doesn't YouTube re-compress your uploaded video anyway? If so, wouldn't this crush your super levels? Sorry to be naive on this, just asking.
There may be more to consider as well:
http://avisynth.org/mediawiki/Luminance_levels
http://avisynth.org/mediawiki/Colorimetry
Seth Bloombaum November 14th, 2012, 10:15 AM I think the biggest issue here is black and white levels. Even in an all-digital workflow (with the end product being viewed ONLY on YouTube), many producers still insist on adhering to broadcast standards (either IRE 7.5 blacks or keeping the range between 16 and 235).
...My point is that for videos intended strictly for YouTube delivery, we should be using super-blacks (0-15) and super-whites (236-255). Not using this full range (in my opinion) is reducing the potential quality of our all-digital videos...
And there's your answer. Color space is the primary "standards" difference (see below for some others). To apply the broadcast range of 7.5-100 IRE to clips for online distribution is to sometimes lower contrast, with no true blacks or whites. However, to call the extended range "super-blacks, super-whites" isn't descriptive outside the NTSC and ATSC worlds. That is, RGB 15-15-15 isn't "super-black", it's dark gray for purposes of online distribution.
It is my experience that YouTube "expects" a video that is made to the .709 spec, when I feed it a 0-255 file it increases the contrast, but when I feed it a .709 it expands to show a 0-255 (almost)...
This behavior varies from codec to codec. What Youtube does is frequently mysterious, but they're not alone in expecting that some delivery codecs conform to Rec. 709, and some to a 24-bit RGB color model. I understand that versions of the same codec from different manufacturers vary in this, and some players may interpret incorrectly.
There are no standards in online distribution. There is no controlling organization. Different codec and player developers have done what they thought was best, and almost none document it.
IMO we have to look at this differently. Develop a workflow and test it, end-to-end. That is, create test media, take it through our finishing process, encode it for distribution, and measure what e.g. YouTube does with it.
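As one possible starting point for that test media (an assumption about method, not anything YouTube documents): a full-range grayscale ramp frame with known code values, built with NumPy and OpenCV, that can be cut into a timeline, run through the finishing and upload chain, and measured at the far end. The file name is a placeholder.

import cv2
import numpy as np
ramp = np.tile(np.arange(256, dtype=np.uint8), (256, 1))   # 256x256 frame, columns run 0..255
cv2.imwrite("ramp_0_255.png", cv2.cvtColor(ramp, cv2.COLOR_GRAY2BGR))
# After the round trip, read a frame back and compare column values against the
# original 0..255 to see where levels were remapped or clipped.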
Other differences from broadcast standards:
Online viewers see your work wall-to-wall. You should adjust headroom accordingly, but have a lighter hand with lower-third titles. Basically, title-safe goes to the edge of the frame, though we've become accustomed to certain graphic composition conventions---but headroom should change.
All display is progressive. Plan accordingly. Usually, the earlier your project becomes progressive, the better. In-camera, for example.
High-motion stresses encoders and frequently produces frame-drops. Shaky-cam techniques and video noise are the primary concerns, but broadcasters usually aren't trying to put up hand-held cell phone concert footage. Broadcasters usually use tripods & lighting... Even this becomes less of a concern as codecs and bandwidth available to viewers continue to improve.
David Heath November 14th, 2012, 05:32 PM My point is that in an all-digital workflow---when no analog devices nor broadcast television are not anywhere in the stream---it is silly to limit black and white levels to 16 and 235. My point is that for videos intended strictly for YouTube delivery, we should be using super-blacks (0-15) and super-whites (236-255). Not using this full range (in my opinion) is reducing the potential quality of our all-digital videos.
Any comments? :-)
Scott
Theoretically, you are absolutely correct---"super-whites" and "super-blacks" are a legacy issue from when digital video was simply analogue video, digitally encoded---complete with allowance made for sync pulses and for subcarrier excursions above white.
In this respect they are like lots of other things - drop frame timecode and the 29.97 fps frame rate being most notable in 60Hz territories. That's left over from the earliest days of NTSC television and stopping the new-fangled subcarrier from causing a buzz on some early monochrome sets.
But ignore legacy at your peril. You'd not have "super-whites" and "super-blacks" if you were starting from scratch, fully agreed, but the danger is that if you just ignore them your product may get blacks and whites severely crushed.
Phill Pendleton November 14th, 2012, 06:27 PM This may be a bit off topic, and forgive me for my ignorance, but I can understand ex-broadcast producers wanting to stick to a standard.
As a cameraman I'm more on the acquisition side than the post side of video standards.
No matter what the project is, I've always shot everything with broadcast as the minimum standard. Even for DVD, web, and YouTube I've stuck to this known standard as a way to future-proof my work. You never know where the work will end up.
Who knows what Youtube will be like in 12 months. Technology moves quickly and one needs to be ready for changes. Would it be crazy to think that Youtube might be broadcast standard in a few years time?
Presently, editors are tweaking my footage to get the best result on YouTube. I know, though, that the footage can be re-tweaked for any future changes.
Roy Feldman November 15th, 2012, 08:43 AM YouTube itself now allows you to tweak fill light, contrast, saturation and white balance---though not with the precision most of us are used to. Just sayin'...
Keith Dobie November 17th, 2012, 12:07 PM Anyone know someone from Google / YouTube who could answer some of these questions? How about Vimeo?
It's easy to download converted versions of your own videos from YouTube or Vimeo, so it would be interesting to do some testing and see exactly what their video processing does.
-----
To download a video from your own YouTube channel:
- Click on your username (top right)
- Click on Video Manager
- For each video you'll see an Edit button. Click on the down arrow to its right and select Download MP4
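Once you have the processed MP4 back, a minimal check (OpenCV; the file name is a placeholder) is to grab a frame and see what known code values became---for example, reading back the kind of ramp frame suggested earlier in the thread. As with any frame grab, OpenCV decodes to full-range RGB, so this shows delivered levels rather than the encoded Y' values.

import cv2
cap = cv2.VideoCapture("downloaded_from_youtube.mp4")
ok, frame = cap.read()
cap.release()
if ok:
    luma = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    h, w = luma.shape
    row = luma[h // 2]                              # middle row of the (possibly rescaled) frame
    for original in (0, 16, 128, 235, 255):
        col = int(round(original / 255 * (w - 1)))  # scale the index in case YouTube resized the frame
        print(original, "->", int(row[col]))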