Quote:
Originally Posted by Adam Stanislav
(Post 1582885)
If I recall correctly, way back when the old B&W TV aired at 30 fps (60 fields/s) in North America and 25 fps (50 fields/s) in Europe. When color was added, they wanted to be backwards compatible with the existing B&W receivers, so they did two changes.
|
Well, more than two changes, but so far so good.
Quote:
(1) They continued airing the B&W signal and added the color difference (as is still done in many digital codes, such as the various MPEG standards).
|
Well, the concept is correct, and they certainly did what they did to maintain compatibility with all the black-and-white TVs, but that's not how color was encoded. You're thinking in the current scheme of things; back then the color was added via a color subcarrier, making what is called a composite video signal. The NTSC color subcarrier was at 3.58 MHz, and the PAL subcarrier was at 4.43 MHz. The component encoding system, which does indeed use color differences (R-Y, B-Y), is something else entirely, and it is in common use today. Pretty much all digital schemes use component (color-difference) encoding now.
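Just to illustrate the component (color-difference) idea for anyone following along: the luma signal Y is the part a B&W set can display, and the two difference signals carry the color on top of it. A minimal sketch, using the standard BT.601 luma weights (the function name is just mine):

```python
# Illustrative sketch of component (color-difference) encoding, as used in
# digital video -- distinct from the old composite-subcarrier scheme.
# Luma coefficients are the standard BT.601 weights.

def rgb_to_ydiff(r, g, b):
    """Convert normalized RGB (0..1) to luma plus color differences."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: the B&W-compatible part
    return y, r - y, b - y                   # (Y, R-Y, B-Y)

# Pure red carries only 29.9% luma; the rest rides in the difference signals.
print(rgb_to_ydiff(1.0, 0.0, 0.0))  # (0.299, 0.701, -0.299)
```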
Quote:
(2) They slowed down the fps by 1.001, so it became 30000/1001 in North America and 25000/1001 in Europe. They did that so they could fit all the additional color information into the existing bandwidth.
|
Umm, no on both counts. PAL was never involved with the 1000/1001 stuff; its field rate in the analog days was 50.00 Hz (25.00 fps), as it still is today. And the reason for the frequency change in NTSC land was to avoid interference between the 3.58 MHz chroma subcarrier and the 4.5 MHz broadcast sound carrier; it had nothing to do with fitting the new color info into the channel bandwidth.
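If anyone wants to check the arithmetic: the line rate was picked so the 4.5 MHz sound carrier is exactly 286 times it, and the chroma subcarrier is an odd multiple (455) of half the line rate, so the two interleave instead of beating against each other. The 30000/1001 frame rate falls straight out of that. A quick sketch:

```python
# Quick arithmetic check of the NTSC color-frequency relationships:
# the standard values are 4.5 MHz sound carrier, 525 lines/frame,
# and the chosen ratios 286 and 455/2.

sound_carrier = 4_500_000.0          # Hz, inherited from B&W NTSC
line_rate = sound_carrier / 286      # line rate chosen to interleave with sound
frame_rate = line_rate / 525         # 525 lines per frame
subcarrier = line_rate * 455 / 2     # chroma on an odd multiple of half the line rate

print(round(line_rate, 2))   # 15734.27 Hz
print(round(frame_rate, 5))  # 29.97003 fps -- exactly 30000/1001
print(round(subcarrier))     # 3579545 Hz -- the famous 3.58 MHz
```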
Quote:
We are still stuck with this anomaly because when we switched to all digital TV, they still wanted to stay compatible with all the analog TV sets (with just a converter box added) that people owned or perhaps still do.
|
I should just give you this one :), but arguably, this was due to standards committee politics; many of us felt it was time to give up the stupid, no longer necessary, 1000/1001 stuff. There is no need for it technically, including converter boxes.
Quote:
I can only hope that we will eventually abandon this anomaly and go to true 24/30/25 fps in the digital world. The way we are doing it is a pain for those of us who write video editing software because sound continues to use Hz, or cycles per second, so keeping everything synchronized when you trim clips and move them around requires you to be aware of the different timing. It is not too difficult with the 24/1.001 video fps as you can think of 48 kHz as 48048 cycles per 1.001 seconds, so you have exactly 2002 sound samples per 24/1.001 fps video frame. But it does not work out as neatly with the 30/1.001 fps video where you have exactly 8008 sound samples per 5 video frames. And it gets even more complicated with 25/1.001 fps, where you are perfectly synchronized only once every 25 frames. That is another good reason to shoot at 24/1.001 fps as opposed to 30/1.001 or 25/1.001 fps.
|
I don't question the difficulties and extra work you have to go through, and I wouldn't mind not having to say 29.97 or 23.976 ever again, particularly since there is really no technical need for the 1000/1001 stuff anymore, but I do wonder where you're getting "25/1.001 fps" footage...
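For anyone who wants to see the sync arithmetic from that quote worked out, exact rationals make it obvious: the denominator of the samples-per-frame fraction tells you how many frames pass before audio and video land on a sample boundary together. A minimal sketch:

```python
# Sketch of the 48 kHz audio vs. 1000/1001 video sync arithmetic discussed
# above, using exact rational math so no rounding hides the pattern.
from fractions import Fraction

AUDIO_RATE = 48000  # Hz

for label, fps in [("24/1.001", Fraction(24000, 1001)),
                   ("30/1.001", Fraction(30000, 1001)),
                   ("25/1.001", Fraction(25000, 1001))]:
    spf = Fraction(AUDIO_RATE) / fps  # audio samples per video frame, exact
    # The reduced denominator is the resync interval in frames.
    print(f"{label}: {spf} samples/frame -> realigns every {spf.denominator} frame(s)")
```

Running it reproduces the numbers in the quote: 2002 samples per frame at 24/1.001 (realigns every frame), 8008 samples per 5 frames at 30/1.001, and realignment only every 25 frames at 25/1.001.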
Billy
ps For those who want to learn more (or check to see if I just made this all up), click
here for a little of what Wikipedia has to say on the subject of frame rate.