View Full Version : What's wrong with new full HD televisions?


Peter Berger
January 3rd, 2013, 08:18 AM
My brother bought a new full HD 3D TV (LG 42LM670S) and he's complaining that the image is so perfect that every film looks like video shot on a cheap small-chip camera.
A few days later my friend told me he saw Ridley Scott's Robin Hood on a full HD TV and it looked like video too.
What's wrong? I don't think it's the resolution, because movies in the cinema don't look like video at all.

Tim Polster
January 3rd, 2013, 08:40 AM
It is the Hz the television is running at. The makers are supercharging this number to get "smooth" playback. See if you can set it to the normal numbers (50Hz or 60Hz depending upon region).

Yep, we jump through all of these hoops for proper 24p production and then the TVs make it look interlaced.

Eric Olson
January 3rd, 2013, 10:42 AM
My brother bought a new full HD 3D TV (LG 42LM670S) and he's complaining that the image is so perfect that every film looks like video shot on a cheap small-chip camera.
A few days later my friend told me he saw Ridley Scott's Robin Hood on a full HD TV and it looked like video too.
What's wrong? I don't think it's the resolution, because movies in the cinema don't look like video at all.

It is difficult to diagnose without more details. Is it possible a composite video cable has been used to connect the movie player to the television? Is it possible "smooth motion" has been turned on?

Using a composite video cable will degrade the signal from a Blu-ray disc player. This degradation becomes quite noticeable on the clear image provided by a full HD television.

Smooth motion interpolates 24 fps source material to produce smooth 120 fps output. This is great for sports; however, in narrative fiction the stuttery motion of 24 fps often serves as an artistic constraint, to the extent that the aesthetic of the movie is spoiled by realistically smooth motion.
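
To make the idea concrete, here is a rough Python sketch of what "smooth motion" is doing conceptually (a toy illustration of my own, not how any particular TV implements it - real sets estimate motion vectors rather than simply cross-fading):

import numpy as np

def interpolate_frames(frames, factor=5):
    # Blend each pair of consecutive 24 fps frames into `factor` output
    # frames (24 x 5 = 120 fps). A real TV uses motion-compensated
    # interpolation, not a simple cross-fade like this.
    output = []
    for a, b in zip(frames[:-1], frames[1:]):
        for i in range(factor):
            t = i / factor                      # 0.0, 0.2, 0.4, 0.6, 0.8
            output.append((1 - t) * a + t * b)  # weighted blend of the two frames
    output.append(frames[-1])
    return output

# Two dummy greyscale "frames" just to show the call.
frames_24fps = [np.zeros((1080, 1920)), np.ones((1080, 1920))]
frames_120fps = interpolate_frames(frames_24fps, factor=5)
print(len(frames_120fps))  # 6 output frames for this 2-frame clip

Either way, the in-between frames are invented by the set, and that is exactly what removes the 24 fps cadence.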

Oren Arieli
January 3rd, 2013, 11:16 AM
I had the same issue with a new TV (60" Vizio LCD) purchased recently. As a videographer, I hated that it made everything look like a soap opera. The first step was to disable the fast-scan, then I chose the 'movie' mode (most TVs have viewing modes for various looks). You can then further tweak the settings to your liking. It made a huge difference. If the TV is in a default or demo mode, it might be tweaked to emphasize contrast and saturation, which helps it stand out on the showroom floor.
You might also look into a TV calibration disc/Blu-ray. It will help you customize the image for your viewing conditions.

Steve Game
January 3rd, 2013, 11:36 AM
Interlacing was introduced by the original TV development team in the UK in the early 1930s. It was done because they knew from actual tests that for most visual programming, the resultant judder of moving images sampled 25 or 30 times per second would give a poor viewing experience. This was instantly recognisable in programmes that had a mix of studio video and outside shots, usually on 16mm film.
Cinema programmes were originally shot at 16 or 18fps. Film makers pushed for faster frame rates, but the distribution chain (the cinemas) decided that the additional cost would hit their profits. The result was a compromise, i.e. 24fps, which the industry has sold ever since. There's nothing magical about 24fps; in fact the film industry has had to try to minimise the impact of frame judder through very restrictive zoom and pan practice, and wide shutter angles to introduce an amount of motion blur.
Now with digital cinema, the compromise can be lifted with a relatively low overhead, in just the same way that HD cameras and camcorders are increasingly being introduced with 50p and/or 60p modes.
With Peter Jackson's 'The Hobbit', released in a High Frame Rate version, getting a barrage of complaints from established film critics, lessons will be quickly learnt, firstly with the sequels to 'Avatar' that James Cameron has promised in 48fps, then by an increasing trend from all other major directors. 24fps films will no doubt continue for years to come, just as the occasional black and white works appear, mainly for effect. I think the nostalgic flicker and judder of (relatively) low definition images will come to be regarded as unacceptable for anything serious.

Bruce Watson
January 3rd, 2013, 11:43 AM
What's wrong?

Typically, user error.

Modern HDTVs are typically capable of outstanding picture quality -- IF -- they are set up correctly. And hardly any are.

Wacharapong Chiowanich
January 3rd, 2013, 11:50 AM
Same situation here in my area. Most newer 1080p HDTV sets on sale have multiple PAL display frame rates, i.e. 100Hz, 200Hz or even up to 400Hz. That's up to 8 interpolated fields/frames for every raw 1080i50 input, or 16 interpolated frames for every raw 1080p25 input, etc. Since all PAL HDTVs can also play back NTSC content via the HDMI and component inputs, if you plug in a Blu-ray player playing back a Hollywood movie, then what you're watching on the connected HDTV is either 120Hz, 240Hz or 480Hz, and NEVER the 24p original recorded on the disc.
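
To spell out that arithmetic (plain division, nothing specific to any particular model):

# Displayed frames per source frame = panel refresh rate / source rate.
# Simple arithmetic only; an actual set may switch modes rather than interpolate.
for panel_hz in (100, 200, 400):
    for source, rate in (("1080i50 fields", 50), ("1080p25 frames", 25), ("24p film frames", 24)):
        print(f"{panel_hz}Hz panel, {source}: {panel_hz / rate:g} shown per input")

At 400Hz that gives the 8 and 16 figures above, and you can also see that 24p never divides evenly into the PAL panel rates, which is why it gets resampled one way or another.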

These new HDTVs seem to be a great leveler in terms of making all kinds of videos displayed on them look more or less the same, which is to say like video, regardless of where or at what level they originated. You may be able to tell the difference by the lighting, depth of field or apparent sharpness or resolution you are seeing, but you will never be able to tell by the perceived motion on the screen.

I've shot videos in all the different frame rates, e.g. 1080i50, 1080p25, 1080i60, 1080p24 (with or without pulldown), 1080p50 and 1080p60, and they all look the same on my 100Hz/120Hz 40" HDTV. All Blu-ray movies look pretty much the same motion-wise despite a vast difference in gamma levels, sharpness and audio quality.


I too don't understand why people jump through hoops to produce content in 24p when most viewers in front of their TV sets either can't see it or don't have a clue. It makes a difference only if the content gets projected in theaters or played back on computer monitors (and even there the issue of refresh rates comes into play to make the perceived motion look different).

Peter Berger
January 3rd, 2013, 01:22 PM
If I understand it correctly - when I buy a movie on DVD/Blu-ray or I watch a TV broadcast, it's 25fps (I live in Europe), but the television (in normal 50Hz mode) will duplicate every frame? So on the TV I'm actually watching 50fps? And a 100Hz TV will interpolate even more frames?

And the last question - when I watch a movie at the cinema (with DLP projection), there is no motion interpolation? It's just 24fps in the USA and 25fps in Europe?

Chris Medico
January 3rd, 2013, 01:33 PM
It isn't strictly a frame rate issue; it is the high frame rate combined with interpolation that is the problem. The interpolation feature goes by a few different names depending on the brand, but look for something like "smooth motion" or some such feature and turn it off. That may not change the display frame rate of the panel itself, but it should turn off the interpolation. That will restore the visual cadence of the material.
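
To answer the duplication question above with a toy example (single-number "frames", purely illustrative): simple repetition keeps the original cadence, interpolation does not.

# 25p source on a 50Hz panel, each "frame" reduced to a single brightness value.
source_25p = [0.0, 1.0, 2.0]

# Plain 50Hz display: each frame is simply shown twice - cadence unchanged.
duplicated_50hz = [f for f in source_25p for _ in range(2)]

# "Smooth motion" at 50Hz: invented in-between frames - motion now changes every 1/50 s.
interpolated_50hz = [0.0, 0.5, 1.0, 1.5, 2.0, 2.0]

print(duplicated_50hz)    # [0.0, 0.0, 1.0, 1.0, 2.0, 2.0]
print(interpolated_50hz)  # the "video look"

So yes, a 50Hz set showing 25fps material repeats each frame, and that is harmless; it is only the invented in-between frames that change how the motion feels.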

Also consider getting the TV calibrated to bring the saturation and contrast range back down to realistic levels.

Peter Berger
January 3rd, 2013, 02:34 PM
How do you calibrate the TV?

Chris Medico
January 3rd, 2013, 02:39 PM
You will need some calibration hardware and software such as this:

Datacolor Spyder4TV HD Color Calibration System S4TV100 (B&H)

You can also use a calibration DVD.

Here is a basic outline - CNET's quick guide to TV calibration - CNET Reviews (http://reviews.cnet.com/4520-6463_7-5085739-3.html)

David Heath
January 3rd, 2013, 03:53 PM
Interlacing was introduced by the original TV development team in the UK in the early 1930s. It was done because they knew from actual tests that for most visual programming, the resultant judder of moving images sampled 25 or 30 times per second would give a poor viewing experience. This was instantly recognisable in programmes that had a mix of studio video and outside shots, usually on 16mm film.
That's not quite right. There are two separate issues - "flicker" and motion portrayal or "jerkiness".

To get reasonably smooth motion, it's enough to present the eye with something like 16 separate images per second, though yes, 24/25 is better. If you could blend one image into the next, that would be fine without extra complication - and that's more or less the case with something like a Steenbeck for film. (Which uses a rotating prism.)

What you describe as the poor viewing experience is due to the pulldown necessary for film projection, and the fact that the screen has to be blanked out whilst the claw pulls the film down. If that was all that happened, then the flicker would make viewing very uncomfortable. Hence the bladed shutter. It gives an extra blanking out of the screen each frame, increases the flicker rate to 48Hz and takes away the discomfort. (Just.)

But what about the early days of TV? The display needed to be scanned about 50 times per second to overcome the flicker problem, even though 25 frames is good enough for motion portrayal. And CRTs of the time had practical limits on roughly how many lines and fields they could scan per second. Worth looking at: Interlaced video - Wikipedia, the free encyclopedia (http://en.wikipedia.org/wiki/Interlaced_video#History) and especially :
In 1936, when the analog standards were being set in the UK, CRTs could only scan at around 200 lines in 1/50 of a second. By using interlace, a pair of 202.5-line fields could be superimposed to become a sharper 405 line frame. The vertical scan frequency remained 50 Hz, so flicker was not a problem, but visible detail was noticeably improved. As a result, this system was able to supplant John Logie Baird's 240 line mechanical progressive scan system that was also being used at the time.

I believe interlace had previously been patented by RCA (?) in the States, and wasn't a British invention. I can't remember the exact story, but I think EMI or Marconi were allowed to use the patent under some agreement - but Baird wasn't. And the rest is (literally :-) ) history.

There was a recent exhibition at Alexandra Palace to mark the 75th anniversary of "High Definition" UK broadcasting, and part of it included a simulation of the Baird system, complete with 25Hz flicker. The big question is how anybody could ever have thought it would be acceptable? The answer was that it was far better than the 30 line system that had preceded it! But one can only imagine the sinking feeling in the stomachs of the Baird team in 1936 when they first saw an EMI-Marconi broadcast.......

Giroud Francois
January 3rd, 2013, 03:57 PM
Most TVs are also set to clean up noise (when it hasn't already been cleaned up by compression), so the picture looks plastic.

Ron Evans
January 3rd, 2013, 07:12 PM
Most 120Hz and 240Hz interpolating TVs can be set to display 24p from a Blu-ray player over HDMI, but they must be set up to do this, otherwise they will interpolate rather than emulate a multi-blade film projector. Personally I have mine set to interpolate, as I really dislike the juddering of 24p and the frequent bad camera work and editing!!!

Ron Evans

Tim Polster
January 4th, 2013, 08:41 AM
Is there a TV setting to get rid of my bad camera work? :)

Laurence Janus
January 4th, 2013, 10:03 AM
This article covers some of the things discussed here! The best thing I ever did for my TV was turning off MotionFlow.

Prolost - Blog - Your New TV Ruins Movies (http://prolost.com/blog/2011/3/28/your-new-tv-ruins-movies.html)

Ron Evans
January 4th, 2013, 12:41 PM
Nice article, it describes the issues well. The other thing that the interpolating displays do well, and that isn't mentioned, is deinterlacing better than a simple display - noticeably better on my 240Hz Sony than on my Panasonic plasma. I do like the picture of the plasma, though.

Ron Evans

Steve Game
January 4th, 2013, 02:12 PM
I believe interlace had previously been patented by RCA (?) in the States, and wasn't a British invention. I can't remember the exact story, but I think EMI or Marconi were allowed to use the patent under some agreement - but Baird wasn't. And the rest is (literally :-) ) history.

Thanks for the correction David, I should have checked myself. The EMI team were the first to establish a High Definition (sic) TV system spec for the 1936 launch - the first ever TV video compression method!
As you implied, the Baird Nipkow system, despite being progressive, was seen to be totally irrelevant when 405-line TV appeared, and System A served us well into the '80s. It's a shame that in the UK, the lay answer to the question of who invented TV is usually John Logie Baird, not Alan Blumlein's team at EMI.

Steve Game
January 4th, 2013, 02:21 PM
Most TVs are also set to clean up noise (when it hasn't already been cleaned up by compression), so the picture looks plastic.

Video noise, whether present at source or inherited from the production/transmission chain, will consume compression bandwidth because each frame is different. As the last stage in the chain, broadcast transmission usually has the lowest bandwidth and the most aggressive compression, so the codec will just reduce the effective resolution of the video, making it look 'plasticky' when the edge sharpening does its thing. If the edge sharpening is reduced, the plastic effect will be reduced, albeit with an overall softer picture.

D.J. Ammons
January 7th, 2013, 10:29 AM
Going back to the initial post in this thread, I agree with those who say the issue is the Hz the TV is running at.

I jumped on the 120Hz bandwagon when they first came out and promptly found out that while it was good for sports, it made filmed shows look like video and completely lost the filmic look we are conditioned to.

My family hated it and I immediately disabled the 120Hz feature and have run it at 60Hz. The difference is amazing.

Eric Olson
January 9th, 2013, 09:39 AM
At 60Hz is it possible to play 24fps at exactly 24Hz, or do you get the 2:3 pulldown cadence of film telecined to video?

Chris Medico
January 9th, 2013, 09:44 AM
At a 60Hz display rate you will have pulldown for 24/23.976 sources.
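
For reference, the 2:3 cadence itself is easy to write out (just the standard pattern, illustrated in Python):

# 2:3 pulldown: 4 film frames (A, B, C, D) become 10 video fields,
# so 24 film frames/s map onto 60 fields/s (30 fps interlaced).
film_frames = ["A", "B", "C", "D"]
fields_per_frame = [2, 3, 2, 3]

fields = []
for frame, count in zip(film_frames, fields_per_frame):
    fields.extend([frame] * count)

print(fields)            # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields) * 6)   # 60 fields per second, from 4 * 6 = 24 film frames

That uneven 2-3-2-3 repetition is the slight stutter you see on a 60Hz set, as opposed to a 120Hz set that can show each 24p frame exactly 5 times.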

Panagiotis Raris
February 4th, 2013, 07:10 PM
First thing to do with a new TV is turn every 'feature' off, then turn them back on one by one to see what/how they change the image and whether you like them. I prefer all the true-motion/smooth-motion/120+ Hz crap disabled, same with the automatically adjusting contrast, ultra contrast or color modes, etc.

I calibrate mine with my DSLR and images I have taken, to my liking. This is a long-standing 'war' in my house; the woman likes most of that stuff on, hates 'jittery' 24p, and likes everything ultra colored and with all the contrast the TV can supply. That said, she is a far better photographer and editor than I; I'll just never understand her tastes, lol. Her preferences even make the ultra-flat Skittles commercials look like they're all Lego people.