DV Info Net


Steve Kalle March 25th, 2011 10:53 PM

Analog Broadcast IRE Levels
 
I help run a gov't access channel, and much of the video appears too bright on regular LCD TVs. However, the exposure looks great on my Eizo CG243W via Premiere Pro, as well as on a Panasonic BT-LH1710w via HD-SDI and composite.

So, my question is this: do exposure levels need to be reduced for analog broadcast? I am waiting for Comcast techs to get back to me about whether their hardware is causing this issue. I know what the video looks like coming directly from the playback server, which I have connected to the Panasonic, so I am wondering whether the analog equipment at the head-end is designed to work with lower IRE levels.

Battle Vaughan March 25th, 2011 10:59 PM

Re: Analog Broadcast IRE Levels
 
Standard US analog broadcast video is between 7.5 and 100 IRE. Some interesting information here: 7.5 IRE Setup

Steve Kalle March 25th, 2011 11:08 PM

Re: Analog Broadcast IRE Levels
 
Yes, I know the standards of 7.5 to 100 IRE (aka 16-235) and Caucasian faces exposed at around 70 IRE, which they are according to the waveform.
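
To spell out that mapping, here is a rough Python sketch (my own arithmetic, not from any spec sheet) of how the 7.5-100 IRE analog range lines up with 8-bit codes 16-235, and where a 70 IRE face reading lands:

    # Mapping between NTSC analog IRE (with 7.5 setup) and 8-bit Rec. 601 code
    # values, assuming code 16 = 7.5 IRE (setup black) and code 235 = 100 IRE.

    def ire_to_code(ire):
        """Analog IRE level (with 7.5 setup) to 8-bit code value."""
        return 16 + (ire - 7.5) / 92.5 * 219

    def code_to_ire(code):
        """8-bit code value back to analog IRE with setup."""
        return 7.5 + (code - 16) / 219 * 92.5

    print(ire_to_code(7.5))    # 16.0  -> black
    print(ire_to_code(100.0))  # 235.0 -> peak white
    print(ire_to_code(70.0))   # ~164  -> where a 70 IRE skin tone sits digitally

(For a no-setup signal the mapping would simply be 16 + IRE/100 * 219 instead.)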

Battle Vaughan March 26th, 2011 10:13 PM

Re: Analog Broadcast IRE Levels
 
The link I attached talks about the difference in analog and digital levels, which might be of some import, or maybe not.

I have noticed that NTSC video on LCD screens, which perforce undergoes some kind of conversion because LCDs don't draw the analog scan lines as such, seems to be adversely affected. In fact, just browsing the LCDs lately at the local big box store, I walked out empty-handed: the taped HD samples were stunning, the local TV broadcasts not so much.

I see the same thing on our LCD monitors when working with NTSC video. So maybe that conversion is a factor?

Steve Kalle March 27th, 2011 01:43 AM

Re: Analog Broadcast IRE Levels
 
Thanks Battle for the link.

Hopefully, I can get my nanoFlash to take the cable box's HDMI output and convert it to SDI so I can check the waveform on my 17" Panasonic. If that works, I can then switch between the two SDI inputs, the other being fed directly from the playback server.

Don Parrish March 27th, 2011 05:30 AM

Re: Analog Broadcast IRE Levels
 
I am not impressed with the local commercials broadcast by Comcast. One commercial here in Central Florida from a West Coast Harley dealer was terrible: for weeks it had badly overmodulated, unintelligible audio. I emailed the dealer and the problem was solved in a couple of days. The relevance of my post is a friendly reminder that Comcast quality seems to have slipped over the years; I think you should monitor their broadcast and keep in mind that Comcast may be struggling with quality. Almost all local commercials in my area have overmodulated audio; you would think they would monitor their own work! I have a Sony that can be turned up very loud on national broadcasts and movies, so it is not my TV.

Roger Keay March 27th, 2011 09:35 AM

Re: Analog Broadcast IRE Levels
 
Quote:

Originally Posted by Steve Kalle (Post 1632106)
Thanks Battle for the link.

Hopefully, I can get my nanoFlash to take the cable box's HDMI output and convert it to SDI so I can check the waveform on my 17" Panasonic. If that works, I can then switch between the two SDI inputs, the other being fed directly from the playback server.

You need to use standard test signals to get to the bottom of this problem. If possible, put color bars on the channel for a few minutes in the middle of the night. If the bars look OK, you know it is something to do with the levels in the program content.

Be careful with the test you are planning from the HDMI output. As far as I recall, the HDMI link should not have any setup as the norm for digital video is no setup. The cable box should remove the setup when performing the analog to digital conversion. An old TV with a video out connection might be a better means of getting access to the demodulated video from an analog channel as it won't be doing any fancy processing.

You can get bars and tone from Premiere Pro and create a file for your server. The color bars include a test pattern for checking setup level under the red color bar, in the same horizontal band as the peak-white reference. If the setup on your monitor is correct, the area in the bottom right of the screen should be black except for a slightly brighter vertical rectangle (about 1/3 the width of a color bar) under the right side of the red bar. If that bar is not visible, the black level of the monitor is too low. If a similar bar that is darker than the rest of the area is visible below the left edge of the red color bar, then the black level is too high.
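
To put numbers on that pattern, here is a quick Python sketch; I am assuming the classic NTSC PLUGE values of 3.5, 7.5 and 11.5 IRE, so treat it as an illustration rather than a spec reference:

    # PLUGE strips in classic NTSC SMPTE color bars (assumed values): three narrow
    # bars at 3.5, 7.5 and 11.5 IRE under the red bar in the bottom band.
    pluge_ire = {"blacker than black": 3.5, "black": 7.5, "lighter than black": 11.5}

    def ire_to_code(ire):
        # Assuming 7.5 IRE (setup black) = code 16 and 100 IRE = code 235.
        return round(16 + (ire - 7.5) / 92.5 * 219)

    for name, ire in pluge_ire.items():
        print(f"{name}: {ire} IRE -> 8-bit code {ire_to_code(ire)}")

    # On a correctly adjusted monitor only the 11.5 IRE strip is visible;
    # seeing the 3.5 IRE strip means the black level is too high, and losing
    # the 11.5 IRE strip means the black level is too low.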

Your post suggests that you are passing your signal to the cable operator using an SDI connection. You need to establish a clear understanding of how the setup is being managed. Do the programs from your server have setup when they are sent to the cable operator, or do you expect the digital-to-analog converter to add 7.5 units of setup? If the programs have setup and the converter adds setup, then black is raised to about 15 units and the image will be too bright, somewhat desaturated, and lacking solid blacks. Digital signals don't usually have setup, so the digital-to-analog conversion should be adding it.
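
To put rough numbers on the double-setup case (just back-of-the-envelope arithmetic on my part):

    # Adding 7.5 units of setup compresses 0-100 IRE into 7.5-100 IRE,
    # i.e. out = 7.5 + in * 0.925.

    def add_setup(ire_in):
        return 7.5 + ire_in * 0.925

    print(add_setup(0.0))    # 7.5   -> correct result when the source has no setup
    print(add_setup(7.5))    # ~14.4 -> black raised to roughly 15 units (double setup)
    print(add_setup(70.0))   # ~72.3 -> mid-tones lifted as well
    print(add_setup(100.0))  # 100   -> peak white unchanged

which is why a double-setup picture looks washed out at the dark end while the whites stay put.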

You are working with a public access channel which suggests that program material is coming from a variety of sources. You should probably standardize on no setup and inspect incoming material to ensure compliance. If you are ingesting NTSC analog signals from tape machines then ensure your video input hardware is configured to remove setup when it is present.
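
If you wanted to automate that inspection, something along these lines might work; this is only a sketch, assuming ffmpeg and numpy are available, an SD 720x480 source, and a file name and threshold I made up for illustration:

    # Flag clips whose blacks sit near code 32 (about 7.5% above code 16),
    # which usually means analog setup was baked in during capture.
    import subprocess
    import numpy as np

    def darkest_luma(path, width=720, height=480):
        """Decode to raw 8-bit Y'CbCr 4:2:0 and return the 1st-percentile luma."""
        raw = subprocess.run(
            ["ffmpeg", "-v", "error", "-i", path,
             "-f", "rawvideo", "-pix_fmt", "yuv420p", "-"],
            capture_output=True, check=True).stdout
        frame_size = width * height * 3 // 2          # Y plane plus subsampled Cb/Cr
        frames = len(raw) // frame_size
        y = np.frombuffer(raw[:frames * frame_size], np.uint8)
        y = y.reshape(frames, frame_size)[:, :width * height]   # keep only the Y plane
        return float(np.percentile(y, 1))

    level = darkest_luma("incoming_clip.mov")         # hypothetical file name
    if level > 28:                                    # threshold is a rule of thumb
        print(f"Blacks at code {level:.0f}; clip probably has setup baked in")
    else:
        print(f"Blacks at code {level:.0f}; looks like a no-setup transfer")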

I hope some of these ideas may help and not send you down too many dead ends.



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network