Superwhites? at DVinfo.net
Old July 11th, 2005, 08:11 PM   #1
New Boot
 
Join Date: Jun 2005
Posts: 6
Superwhites?

Hi. I'm a stills guy. That is to say, I have lots of experience with still photography, 35mm and digital, and I use Photoshop to edit my stuff. I know how to use Photoshop levels to make a picture pretty, but when I go near Premiere or Final Cut levels my skin starts to melt and I retreat to the darkness...

Can anybody explain all the scopes and picture controls we have to deal with here?

Is this because you are modifying the NTSC waveform, or is it just because of the way the CCD captures video, or... what the heck is going on?

A list of stuff I don't get:

Superwhites... what are they?
Something labeled IRE, on the YC waveform?
That weird circular thing, the vectorscope?
The "YCbCr parade"... that's like the labels for component video, right?
What are the units, etc., for the RGB parade?

One thing I don't even understand in stills:
Gamma?

What do the black and white input levels affect? And what do they affect in each color?

Is there any way to control mid-tones?

I have yet to fully play around with Premiere, but I assume the learning curve is much steeper than Photoshop's.

Any help would be appreciated :)
Matt Schoen
Old July 11th, 2005, 09:09 PM   #2
Inner Circle
 
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
From what I understand of DV (this is my Coles Notes version):

DV is captured in Y Cr Cb form, which is sometimes (incorrectly) called YUV.
Explanation of Y Cr Cb: http://www.fourcc.org/fccyvrgb.php#mikes_answer

Most NLEs convert the video into RGB color space before rendering filters, instead of working in YCrCb ("YUV") color space.
An example of RGB values would be the ones you see in Photoshop.

Ok.

For NTSC composite signals, some colors are illegal. Maximum white is defined as 100 IRE (IRE is an analog unit). Black is defined as 7.5 IRE (0 IRE for Japan; PAL is a different format, and 0 IRE is black for PAL).

When going from DV --> analog, 235 235 235 (RGB, a digital value) is supposed to translate to 100 IRE (analog). 16 16 16 (RGB) is supposed to translate to 7.5 IRE (analog), although lots of consumer and "prosumer" equipment puts it at 0 IRE (which is wrong).

Anything over 235 235 235 is a superwhite value. It's brighter than it's supposed to be. In post, you can swing these values down into the legal luminance range, or clip them (or leave them, which can be OK in some cases).
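
Here's a minimal sketch (not any NLE's actual filter, just assumed 8-bit arithmetic) of those two common approaches, clipping versus rescaling:

Code:
# Two ways to handle superwhite luma: hard-clip at legal white (235),
# or linearly rescale the whole 16-255 range into the legal 16-235 range.
LEGAL_BLACK, LEGAL_WHITE, SUPER_WHITE = 16, 235, 255

def clip_superwhites(y):
    # Throws away any detail above legal white.
    return min(y, LEGAL_WHITE)

def compress_superwhites(y):
    # Rescales 16-255 into 16-235, preserving highlight detail.
    scale = (LEGAL_WHITE - LEGAL_BLACK) / (SUPER_WHITE - LEGAL_BLACK)
    return LEGAL_BLACK + (y - LEGAL_BLACK) * scale

for y in (16, 180, 235, 255):
    print(y, "->", clip_superwhites(y), round(compress_superwhites(y)))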

Waveform monitor in editing programs:
It shows how your video will correspond to analog values. CAVEAT: If your equipment puts 16 16 16 (RGB) at 0 IRE, then the waveform monitor in Premiere can be wrong.
Some editing programs like Vegas can toggle between putting 16 16 16 (RGB) at 7.5 IRE or 0 IRE. For Vegas, read the manual.
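
As an illustration of that toggle (assumed arithmetic, not Premiere's or Vegas's actual code), a software scope maps digital luma to IRE roughly like this:

Code:
# Map 8-bit digital luma (16 = black, 235 = white) to IRE,
# with or without 7.5 IRE setup.
def luma_to_ire(y, setup=True):
    black_ire = 7.5 if setup else 0.0
    return black_ire + (y - 16) * (100.0 - black_ire) / (235 - 16)

print(luma_to_ire(16), luma_to_ire(235))      # 7.5 100.0
print(luma_to_ire(16, setup=False))           # 0.0
print(round(luma_to_ire(255), 1))             # superwhite lands around 108 IRE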

The other illegal colors:
The way NTSC signals work, the color information is piggybacked on top of the luminance information. This way, old black & white TV sets just see the luminance information and work properly, while color TV sets get color. Back in the day, that solved compatibility issues.

Anyway, a limitation of this system is that certain colors cannot be transmitted because they will cause a buzz in the audio (and in even worse cases, damage transmission equipment). The color information combined with the luminance information can cause voltages to go too high or too low (in which case they'd interfere with the sync signals). 133 IRE is too high, and -20 IRE is too low. Many broadcasters won't let signals get near 133 IRE; they will either clip your signals to their specs or reject your master. 115 IRE is about the highest you should go, although this varies from station to station.
To see this stuff, flip the WFM into composite mode.
To make overly bright and saturated colors legal, you need to lower brightness, lower saturation, or both. Changing the hue slightly may also help.
You generally won't have problems with signals that get too low.
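
To see where that 133 IRE figure comes from, here is a rough sketch (the coefficients are the standard NTSC/Rec. 601 ones, and 7.5 IRE setup is ignored for simplicity): the chroma subcarrier rides on top of the luma, and for a 100% saturated yellow the sum peaks right around 133 IRE.

Code:
import math

def composite_excursion(r, g, b):
    # Peak and trough, in IRE, of the composite signal for a
    # normalized (0.0-1.0) RGB color.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    chroma_amp = math.hypot(u, v)   # subcarrier amplitude
    return (y + chroma_amp) * 100, (y - chroma_amp) * 100

peak, trough = composite_excursion(1.0, 1.0, 0.0)   # 100% yellow
print(round(peak), round(trough))                   # ~133, ~44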

Chroma also can't exceed a certain point. On the vectorscope, the colors are too saturated if they fall outside the biggest circle. You generally won't have a problem with this.

Reading a vectorscope:
As colors get closer to the edge, they are more saturated. Rotation around the center corresponds to hue.
There are labelled targets for certain colors.
A vectorscope can be a helpful tool in color correction.
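
As a sketch of the idea (not any scope's real implementation), each pixel's chroma pair is just plotted in polar form:

Code:
import math

def vectorscope_point(cb, cr):
    # Distance from center = saturation; angle = hue.
    saturation = math.hypot(cb, cr)
    hue = math.degrees(math.atan2(cr, cb))
    return saturation, hue

print(vectorscope_point(0.0, 0.0))    # gray: a dot at the center
print(vectorscope_point(-0.3, 0.4))   # saturated color at some hue angle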

Reading a WFM:
Take a column of your image. The WFM looks at the brightness of each pixel (when in luminance mode) and plots a dot for it in the corresponding column of the WFM display.
So you just read the WFM left to right, matching the image left to right.
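
A sketch of that construction (illustrative only):

Code:
def waveform_points(frame):
    # frame[row][col] holds luma values. Each pixel becomes a dot
    # at (its column, its brightness), so the WFM reads left to
    # right just like the image does.
    return [(col, luma)
            for row in frame
            for col, luma in enumerate(row)]

frame = [[16, 120, 235],
         [20, 128, 255]]          # a tiny 2x3 "image"
print(waveform_points(frame))     # column 2 shows a superwhite dot at 255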

In certain cases, you may not need to care about broadcast-safe colors/levels. An example would be showing video on an LCD projector. Another possible example is DVD; for DVD, broadcast-safe is an issue if people have their TV hooked up to their VCR via a single cable.

RGB parade:
I don't understand what it's for. Maybe it'd be useful for component signals?

Gamma:
Not sure what you need to know.

Quote:
Is there any way to control mid-tones?
Yes.
Premiere Pro has a decent color corrector. You can adjust gamma to control overall brightness. You can use curves, which work on the same idea as Photoshop's. You can add a tint to the midtones via the middle wheel in the color corrector. And some other things.
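
A quick sketch of why a gamma control works on the mid-tones (a plain power curve is assumed here): the endpoints stay fixed, so only the middle of the range moves.

Code:
def gamma_adjust(x, gamma):
    # Normalized 0.0-1.0 input through a power curve.
    return x ** (1.0 / gamma)

for x in (0.0, 0.5, 1.0):
    print(x, "->", round(gamma_adjust(x, 1.4), 2))
# 0.0 -> 0.0, 0.5 -> 0.61, 1.0 -> 1.0: blacks and whites stay put.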


OK, so that's what I know. Hopefully I'm right and it all makes sense to you.

Last edited by Glenn Chan; July 12th, 2005 at 03:55 PM.
Glenn Chan
Old July 12th, 2005, 09:49 AM   #3
Major Player
 
Join Date: Jun 2004
Location: McLean, VA United States
Posts: 749
Gamma can be a thoroughly confusing subject. The reason we have to deal with it at all is that the voltage output of sensors is not proportional to the light energy which falls upon them, but rather to some power (mathematical, i.e. an exponent) of the light intensity. Technically, gamma is the numerical value of that exponent.

Practically speaking, gamma is the slope of the "curves" in Photoshop. Where the curve is steep, gamma is high and contrast is enhanced. Where the curve is shallow (as in the highlights and shadows when you steepen it in the middle to enhance mid-tone detail), contrast and the perception of detail are lost.
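
A small numeric sketch of that slope idea, assuming a plain power curve:

Code:
def apply_gamma(x, gamma=2.2):
    # Normalized 0.0-1.0 input through x ** (1/gamma).
    return x ** (1.0 / gamma)

def local_slope(x, gamma=2.2, eps=1e-4):
    # Approximate derivative of the curve at x.
    return (apply_gamma(x + eps, gamma) - apply_gamma(x, gamma)) / eps

print(round(local_slope(0.05), 2))  # steep (~2.3): contrast boosted here
print(round(local_slope(0.9), 2))   # shallow (~0.5): contrast compressed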
A. J. deLange
Old July 12th, 2005, 10:42 AM   #4
Trustee
 
Join Date: Jul 2003
Location: US
Posts: 1,152
For digital video, black is always at 0 IRE. For analog NTSC video in the U.S., black is at 7.5 IRE. Some consumer/prosumer DV camcorders output the video signal through their analog jacks without bumping the blacks up to 7.5 IRE.

Premiere Pro 1.5 has a setting at the top of the scopes for displaying blacks at 0 IRE (no analog "setup") or at 7.5 IRE (with "setup"). This does not alter the video in any way; the video files remain untouched. It only changes how the video signal is displayed on the scopes.

The RGB parade waveform displays the levels of the red, green, and blue channels side by side.

Gamma refers to the midtones.
Christopher Lefchik
Old July 12th, 2005, 03:38 PM   #5
Inner Circle
 
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
Christopher, I may be wrong here, but IRE is an analog unit of voltage, which doesn't apply to the digital domain.
Glenn Chan
Old July 12th, 2005, 04:59 PM   #6
Trustee
 
Join Date: Jul 2003
Location: US
Posts: 1,152
Glenn,

You're absolutely right; however, IRE units are still used in software waveform scopes. I suppose it's kind of like digital cameras still using ISO film-speed sensitivity settings.
Christopher Lefchik
Old July 12th, 2005, 06:11 PM   #7
New Boot
 
Join Date: Jun 2005
Posts: 6
Wow. Lots of information.

I still have a few questions about YCrCb.

From my vague understanding of how NTSC works, it is a composite waveform which instructs the electron gun in a TV on how to scan the screen, with the intensity of the electron flow determining the color. And there's that funny little squiggle that tells it to change lines.

Firstly, I never understood how color was achieved. Is it something like a single-CCD still camera, in that individual "pixels" are oriented such that each one always turns a certain color, or is each spot on the screen capable of turning any color/intensity? I'm a little confused about that.

What exactly do "luma" and "chroma" mean? One is intensity, one is color value? Like controlling hue and saturation values? And why are there three for YCrCb... like, what do each of those values mean in terms of the composite waveform or the signal displayed?

Basically, I'd like to know how exactly the wave "contains" the image information.
Matt Schoen
Old July 12th, 2005, 07:18 PM   #8
Inner Circle
 
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
Backgrounder on NTSC:
http://www.danalee.ca/ttt/analog_video.htm
It's pretty easy to read relative to other stuff on the subject.

Quote:
Firstly, I never understood how color was achieved. Is it something like a single-CCD still camera, in that individual "pixels" are oriented such that each one always turns a certain color, or is each spot on the screen capable of turning any color/intensity? I'm a little confused about that.
How color is achieved is explained in that article.
There are other designs that produce color. 1-CCD designs have complementary color filters in front of each light-gathering element, and that's how they achieve color.

Quote:
What exactly do "luma" and "chroma" mean? One is intensity, one is color value? Like controlling hue and saturation values? And why are there three for YCrCb... like, what do each of those values mean in terms of the composite waveform or the signal displayed?
As far as I understand it:

It's easiest to think of colors in terms of hue, saturation, and brightness (like in Photoshop). Chroma = both hue and saturation. Luminance = brightness.

Y Cr Cb:
Y stores the luminance information. Because our eyes see greater detail in brightness than in color, video systems save space by storing less color/chroma information. For DV, there are 4 samples of luminance information for every 1 sample of color information (referred to as 4:1:1). See the pictures at Adam Wilt's site for more information on color sampling / 4:1:1.
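
A tiny sketch of what 4:1:1 means in practice (the sample values are made up): every luma sample is kept, but only one chroma sample survives per four pixels across a line.

Code:
def subsample_411(y_row, cb_row, cr_row):
    return (y_row,        # full-resolution luma
            cb_row[::4],  # one Cb sample per four pixels
            cr_row[::4])  # one Cr sample per four pixels

y  = [100, 102, 104, 106, 110, 112, 114, 116]
cb = [-10, -11, -12, -13, -20, -21, -22, -23]
cr = [  5,   6,   7,   8,  15,  16,  17,  18]
print(subsample_411(y, cb, cr))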

Cr stores red minus luminance information. If you add Cr back to Y, you recover the red information.

Similarly, Cb stores blue minus luminance information, so Y plus Cb gives you the blue information.

From Y, Cr, and Cb you can figure out the green information (it's algebra). So from those three things you can figure out the R, G, and B information for driving each of the electron guns in a TV set. As well, the luminance component stands by itself, so it can be stored separately from the color information (and you can store more luminance information than chroma information to "get" more resolution).
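
Here's that algebra as a minimal sketch (normalized 0.0-1.0 values and the Rec. 601 luma weights are assumed; real DV uses offset integer ranges):

Code:
KR, KG, KB = 0.299, 0.587, 0.114   # Rec. 601 luma weights

def ycrcb_to_rgb(y, cr, cb):
    # Cr = R - Y and Cb = B - Y, so red and blue fall out directly;
    # green is recovered from the luma equation Y = KR*R + KG*G + KB*B.
    r = y + cr
    b = y + cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

# Round-trip check:
r, g, b = 0.8, 0.4, 0.2
y = KR * r + KG * g + KB * b
print(ycrcb_to_rgb(y, r - y, b - y))   # ~(0.8, 0.4, 0.2)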

For information on this in relation to NTSC signals: http://www.danalee.ca/ttt/appendices.htm#colourencoding

2-
Quote:
Basically, I'd like to know how exactly the wave "contains" the image information.
If you're working from DV, there isn't really a "wave". That's only when you go to analog.

3- Does your head hurt yet? (Mine does.)

Glenn
Glenn Chan
Old July 13th, 2005, 06:25 AM   #9
RED Problem Solver
 
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
http://www.nattress.com/Chroma_Inves...masampling.htm

Should also help!

BTW, that link you posted, Glenn, is incorrect. NTSC does not use YIQ (it is a historical artifact, which was put into a few receivers but never generally used). YUV is a state of video in the process of being converted from component analogue to composite or S-Video.

Light sensors on cameras do tend to be linear. The gamma curve is applied (a) to act as a form of compression so that 8-bit video will not produce banding, and (b) to counteract the non-linearity in the gamma of a CRT monitor.
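
A sketch of point (a), with an assumed plain 2.2 power curve standing in for the real (piecewise) video transfer function: gamma encoding spends far more of the 256 code values on the shadows, which is where banding would otherwise show.

Code:
GAMMA = 2.2

def encode(linear):
    # Linear light 0.0-1.0 -> 8-bit code value.
    return round((linear ** (1 / GAMMA)) * 255)

# Distinct 8-bit codes available for the darkest 10% of linear light:
linear_codes = {round(l / 1000 * 255) for l in range(100)}
gamma_codes  = {encode(l / 1000) for l in range(100)}
print(len(linear_codes), "codes linear vs", len(gamma_codes), "codes with gamma")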

Graeme
Graeme Nattress
Old July 13th, 2005, 07:35 AM   #10
Major Player
 
Join Date: Sep 2002
Location: Belgium
Posts: 804
Quote:
Originally Posted by Graeme Nattress
http://www.nattress.com/Chroma_Inves...masampling.htm

Should also help!

BTW, that link you posted, Glenn, is incorrect. NTSC does not use YIQ (it is a historical artifact, which was put into a few receivers but never generally used). YUV is a state of video in the process of being converted from component analogue to composite or S-Video.

Light sensors on cameras do tend to be linear. The gamma curve is applied (a) to act as a form of compression so that 8-bit video will not produce banding, and (b) to counteract the non-linearity in the gamma of a CRT monitor.

Graeme
As far as I know, NTSC still transmits YIQ signals. Receivers are designed for UV demodulation because of the cost of the prefiltering and matrixing needed to avoid quadrature components in the chroma signals.
Andre De Clercq
Old July 13th, 2005, 08:04 AM   #11
RED Problem Solver
 
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
According to Poynton, "Digital Video and HDTV Algorithms and Interfaces" (a book I highly recommend for technical readers), page 89:

"Y'IQ - Composite NTSC was standardized in 1953 based upon I and Q components that were essentiallu U and V components rotated through 33degrees and axis-exchanged..... ....The scheme never achieved significant deployment in receivers, and I and Q components are now obsolete."

page 312,

"Nearly all NTSC encoders and decoders built since 1970 have been based upon Y'UV components, not Y'IQ"

page 365,

"Television receiver manufactures found the NTSC's I and Q scheme costly, however and the scheme never reached significant deployment in receivers... ...In the 70s and 80s equiband U and V encoders became dominant. In 1990 SMPTE adopted standard 170M, which endoresed U and V encoding... ...Virtually no equipment today uses NTSC's Y'IQ scheme."

So no, Y'IQ is not transmitted any more.

Graeme
Graeme Nattress
Old July 13th, 2005, 09:10 AM   #12
Inner Circle
 
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
Thanks for the correction, Graeme.

This is why I'm not an engineer...
Glenn Chan
Old July 13th, 2005, 09:35 AM   #13
RED Problem Solver
 
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
I guess I'm just well-read, then :-)

Graeme
Graeme Nattress
Old July 13th, 2005, 10:53 AM   #14
Major Player
 
Join Date: Sep 2002
Location: Belgium
Posts: 804
Thanks, Graeme. UV decoders I knew about; UV (equiband) encoding/transmission I had missed.
Andre De Clercq
Old July 13th, 2005, 11:40 AM   #15
Inner Circle
 
Join Date: Mar 2005
Location: Hamilton, Ontario, Canada
Posts: 5,742
Quote:
Originally Posted by A. J. deLange
Gamma can be a thoroughly confusing subject. The reason we have to deal with it at all is that the voltage output of sensors is not proportional to the light energy which falls upon them, but rather to some power (mathematical, i.e. an exponent) of the light intensity. Technically, gamma is the numerical value of that exponent.

Practically speaking, gamma is the slope of the "curves" in Photoshop. Where the curve is steep, gamma is high and contrast is enhanced. Where the curve is shallow (as in the highlights and shadows when you steepen it in the middle to enhance mid-tone detail), contrast and the perception of detail are lost.
From what I understand, gamma is roughly what we still-photo types, especially those of us B&W f/64 wannabes steeped in the Zone System, refer to as "local contrast", controlled largely by the development of the emulsion. In film, if you draw an "S" curve of exposure versus density, the gamma is the slope of the curve at the point of interest. A high gamma means there's relatively little difference between the level of exposure that just cracks the development fog on the film and one that produces the maximum density the emulsion is capable of. A low gamma is just the opposite, with a long scale of luminosities producing densities between the minimum perceptible above fog and the maximum possible. The RANGE of density is the same, from pure white to pure black, but the RATIO of the amount of light required to produce a barely perceptible density compared to that producing the maximum possible is different.

For fellow "Zoners": in video, the brightness control adjusts the toe of the curve to set the point where Zone 0 just breaks into Zone 1, while the contrast control adjusts the shoulder where Zone 9 enters Zone 10. The IRE setup would be equivalent to setting the level of density in the photo negative that will print as Zone 1, in a way analogous to setting the print exposure time. The upper "NTSC legal" limit would be sort of like selecting a paper contrast grade appropriate to the negative's density range.
Steve House