December 12th, 2012, 07:22 AM | #1 |
New Boot
Join Date: Jul 2012
Location: Edinburgh, UK
Posts: 21
1080 vs 720, i vs p
Hi all,
I know this is a topic that gets covered regularly - I've been doing some reading online but I've managed to confuse myself thoroughly. I understand that 1080 and 720 are simply resolutions, and that i and p are simply ways of displaying the same picture. Looking at my export settings in Premiere Pro, I can choose between 1080i and 1080p. I'm confused about how this relates to being displayed on a screen for a couple of reasons. I've read that TV stations only broadcast in interlaced, so what happens if I give them a 1080p file? And what about the viewer - they can choose 1080p or 1080i on their televisions sets too. Where is it decided whether an image is interlaced or progressive? Also, how does the resolution relate to the size of a screen? If I was to buy a 60" TV, surely a 1920x1080 image would have to be stretched up to fit it and it would look crap? Would there be any adverse effect on the quality of the footage if I was to film in 1080p, and place it into a 720p sequence within Premiere Pro? It would obviously have to be shrunk - would that have a negative effect on image quality? If so... How is it that 1080p looks just as good on a 42" television as it does on a 27" television? Surely the 27" television has to shrink the image to fit it on the screen? I hope that's not too confusing a post... it's something I'm struggling to get my head around. Thanks in advance, Jonnie. |
December 12th, 2012, 08:04 AM | #2 |
Join Date: Dec 2010
Location: Western Australia
Posts: 576
|
Re: 1080 vs 720, i vs p
To me, progressive looks more natural, closer to how the eye perceives light. Interlaced looks fake and shiny, studio-like and glittery. :D
December 12th, 2012, 11:43 AM | #3 |
Inner Circle
Join Date: Feb 2007
Location: Apple Valley CA
Posts: 4,874
|
Re: 1080 vs 720, i vs p
You will find 1080 LCD displays down to about 22" from what I've seen, but there are also "HD" displays that are ACTUALLY 720, and if you get close, the difference is usually pretty obvious. You need to (carefully) review the specs for any given display: a 720 display will indeed accept a 1080 signal, but will then "munch and crunch" it onto a 720 panel... you've apparently already surmised that the results may vary.

Ultimately, the goal IMO is to stay at the highest possible resolution/data rate as far into the production/display process as possible. Progressive video should produce MORE data to work with than interlaced in any given capture device; more data means potentially more overall detail, and when the time comes to downsize or compress, you have more input to feed into the program doing the compression/resizing. This varies, of course, but typically the better the source, the better the output. Think of it as "you can't recreate what wasn't captured", although there are some ways to try really hard, and with motion footage you can potentially span the timeline to recreate SOME additional data points.

The key to understanding i vs. p is remembering that MOVING subjects will shift between "frames" or parts of frames. With i you've got some potential for offset between fields (think of the typical "mouse teeth" you see in some poorly processed interlaced video), whereas with p you're dealing with complete frames, and only motion from frame to frame. Again, more complete data to work with.

Hope this helps with understanding the concepts!
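The "mouse teeth" effect described above can be sketched in a few lines. This is a toy model, not any real codec: a frame is just a list of text scanlines, and the two fields of an interlaced frame are captured at slightly different times, so a moving subject lands in different places on even and odd lines.

```python
# Toy sketch of why interlaced video shows "mouse teeth" on motion.
# A frame is a list of scanlines; 'X' marks a moving vertical bar.

def make_frame(offset, height=6, width=12):
    """A progressive frame with a bar at horizontal position `offset`."""
    return ["." * offset + "X" + "." * (width - offset - 1)
            for _ in range(height)]

def interlace(field_a, field_b):
    """Weave even lines from field_a (time t) with odd lines from
    field_b (captured half a frame interval later)."""
    return [field_a[y] if y % 2 == 0 else field_b[y]
            for y in range(len(field_a))]

# The subject moves 3 pixels between the two field captures...
frame = interlace(make_frame(2), make_frame(5))
for line in frame:
    print(line)
# Even and odd lines disagree about where the bar is; that alternating
# horizontal offset between adjacent lines is the comb artifact.
```

With a static subject the two fields would agree and the woven frame would be indistinguishable from a progressive one, which matches the point above about motion being the key difference.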
December 12th, 2012, 01:30 PM | #4 | |||
Inner Circle
Join Date: Jan 2006
Posts: 2,699
|
Re: 1080 vs 720, i vs p
It used to be the case that frame rates were spoken of if progressive, FIELD rates if interlaced. The standards bodies decided some years ago that the correct nomenclature is to always speak of frame rates, even for interlace; hence the most common broadcast systems are 1080i/25 (Europe) and 1080i/30 (US/Japan). You will frequently hear them still referred to as 1080/50i and 1080/60i. Same thing, but old terminology.

In the UK and most of Europe, it's true that the main transmission system is 1080i/25, so yes, interlaced. To answer your question, there are actually three possible scan types: interlaced, progressive, or psf ("progressive, segmented frame"). It may be worth reading the Wikipedia article on progressive segmented frame. In brief, psf is a way of sending progressive video over an interlaced system. The point is that the difference between p and psf is simply the order in which the lines are presented. In each case the data is the same, and it should be possible to transparently reorder and change from one to the other without loss.
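The lossless p-to-psf round trip mentioned above is easy to demonstrate with a toy frame model (lines as strings, nothing codec-specific): splitting one progressive frame into two fields and re-interleaving them recovers the original exactly.

```python
# Minimal sketch of psf ("progressive segmented frame"): the odd and
# even lines of ONE captured frame are sent as two fields, so putting
# them back in line order is lossless. Toy frame model, not a codec.

def to_psf(frame):
    """Split a progressive frame into (top_field, bottom_field)."""
    return frame[0::2], frame[1::2]

def from_psf(top, bottom):
    """Re-interleave two psf fields into the original frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

progressive = [f"line-{n}" for n in range(8)]
top, bottom = to_psf(progressive)
restored = from_psf(top, bottom)
print(restored == progressive)  # True: p <-> psf loses nothing
```

Contrast this with true interlace, where the two fields are captured at different moments and no reordering can merge them back into a single clean frame once there is motion.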
December 12th, 2012, 05:18 PM | #5 |
Inner Circle
Join Date: Dec 2004
Location: Arlington, TX
Posts: 2,231
|
Re: 1080 vs 720, i vs p
Complex topic Jonnie. This has made our jobs a lot more difficult over the past 5-8 years.
First, start with the available broadcast options.

Standard Definition (always interlaced):
- US: 720x480 60i
- EU: 720x576 50i

High Definition:
- US: 1280x720 60p, 1920x1080 60i, and 1920x1080 24p (certain satellite providers only)
- I am not sure about the EU :) but we know 1920x1080 50i is a choice!

As Dave mentioned, movement will play a large part in the perceived image quality. If you have a static camera and no subject movement, 1080i will look as good as 1080p because the fields are duplicates of each other. If you are moving the camera, the scene will start to show mismatches.

But remember, you can shoot in one frame rate and deliver in another format. These "standards" are just wrappers for the footage. 24p material has been delivered over interlaced signals forever.

Where I find all of this important is when you are deciding what format to shoot a project in. The frame rates are where you can get into trouble, like shooting ice hockey in 24p: you will have way too much judder compared to 50p/60p. You can still deliver that material in a broadcast format, but the judder will remain. So it is important to reverse engineer your project and choose your frame rate first, then the resolution.

My overall point is that the TV format is almost of no concern, because whatever you shoot can be delivered. It is how you shoot it that makes it look good or not.

Also, in my experience 720p looks fantastic on a 1080p television. A lot better than 1080p60... because you cannot broadcast in 1080p60! Yet. So do not get caught up in numbers and specs so much; quality is what matters. My father has a 720p plasma and I have a 1080p plasma from the same maker (Panasonic). I can see a little more detail on my TV, but they both look great.

Hope this is not rambling on!
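The point about 24p having "been delivered over interlaced signals forever" refers to pulldown. Here is a sketch of classic 3:2 pulldown for the US case (24p into 60i); the cadence itself is standard, though the frame labels are just illustrative.

```python
# Sketch of 3:2 pulldown: each 24p film frame contributes alternately
# 3 then 2 video fields, so 4 film frames become 10 fields, turning
# 24 frames/s into the 60 fields/s an interlaced signal expects.

def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        count = 3 if i % 2 == 0 else 2
        fields.extend([frame] * count)
    return fields

film = ["A", "B", "C", "D"]   # four consecutive 24p frames
fields = pulldown_32(film)
print(fields)        # ['A','A','A','B','B','C','C','C','D','D']
print(len(fields))   # 10 fields per 4 frames = the 60i cadence
```

In Europe the usual trick is different, as mentioned elsewhere in this thread: the 24fps film is simply run 4% fast and sent as 1080psf/25, no pulldown needed.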
December 12th, 2012, 06:13 PM | #6 |
Major Player
Join Date: Mar 2011
Location: London, UK
Posts: 353
|
Re: 1080 vs 720, i vs p
The prospect of better viewed picture quality from progressive full frames compared with interlaced video at the same frame resolution only holds if the data bandwidth is greater for the progressive video. In other words, more image data requires more bitrate.

It's no coincidence that the current AVCHD camcorders that shoot 1080p/50 require higher bitrate streams when recording at that frame rate compared with either 1080p/25 or 1080i/25. This may or may not be an issue in the wide open spaces of Blu-ray storage, but when it comes to squeezing video streams into multiplexed broadcast channels, the temporal effects of interlacing can be less objectionable than MPEG-2 or MPEG-4 compression artifacts. The BBC, for example, only allows a bit budget of about 9Mbps for each of the four (soon to be five) channels in the HD multiplex. When films are televised, the original 24fps film is scanned 4% fast and broadcast as 1080psf/25, which is seen by viewers as progressive.

Interestingly, the view at the BBC seems to be that it is better to reduce the source image resolution and compress it less in the transmission chain than to try to maintain high detail in the picture and squeeze it harder into the available bandwidth. The UK HD terrestrial broadcasts and their DBS equivalent services are originated as 1440x1080 frames. The original HDCAM studio equipment had this resolution, and although much of it has since been replaced with 'full HD' kit, 1440 samples per line is still used, despite the availability of broadcast statmux kit that could accept the full 1920 samples per line.

I fully agree with David Heath on the real resolution of cameras, especially consumer products including SLRs used to produce video. They have abysmal resolution, yet the sales hype makes big news of the 1080-line recording capability. Most SLRs struggle to achieve a genuine 600 LPPH image, and even that comes with a healthy dose of sub-sampling artifacts and moire patterning.

Last edited by Steve Game; December 13th, 2012 at 02:17 AM.
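The "reduce resolution, compress less" trade-off above is simple arithmetic: fewer source pixels in the same bit budget means more bits per pixel for the encoder. The 9Mbps figure and 1440x1080 raster come from the post; the calculation itself is just a back-of-envelope sketch, not a codec model.

```python
# Back-of-envelope bits-per-pixel comparison for a fixed broadcast
# bit budget. Figures (9 Mbps per channel, 25 frames/s) as quoted.

BUDGET_BPS = 9_000_000   # approximate per-channel budget in the mux
FPS = 25                 # European frame rate

def bits_per_pixel(width, height, fps=FPS, budget=BUDGET_BPS):
    return budget / (width * height * fps)

full_hd = bits_per_pixel(1920, 1080)   # ~0.17 bit per pixel
sub_hd  = bits_per_pixel(1440, 1080)   # ~0.23 bit per pixel
print(f"1920x1080: {full_hd:.3f} bpp, 1440x1080: {sub_hd:.3f} bpp")
# The 1440-wide raster leaves the encoder roughly a third more bits
# per pixel from the same budget: gentler compression, fewer artifacts.
```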
December 12th, 2012, 10:31 PM | #7 | |
Inner Circle
Join Date: Jan 2004
Location: Boca Raton, FL
Posts: 3,014
|
Re: 1080 vs 720, i vs p
A 1080p TV showing HD content is displaying it at 1920x1080 resolution no matter whether it's 42" or 27". Period. The image is not "shrunken" in terms of resolution; the size and density of the pixels vary, not the number of them.
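A quick way to see this is to compute pixel density rather than pixel count. Assuming a 16:9 panel, the same 1920x1080 grid just gets a different pixels-per-inch figure at each size:

```python
# Pixel density (PPI) of a 1080p panel at two diagonal sizes.
# Same 1920x1080 pixel count either way; only the pitch changes.

import math

def ppi(diagonal_inches, h_pixels=1920, v_pixels=1080):
    """Pixels per inch of a 16:9 panel with the given diagonal."""
    diag_pixels = math.hypot(h_pixels, v_pixels)
    return diag_pixels / diagonal_inches

print(f'27": {ppi(27):.0f} PPI')   # ~82 PPI
print(f'42": {ppi(42):.0f} PPI')   # ~52 PPI
# The 27" set packs the pixels closer together; nothing is "shrunk".
```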
December 20th, 2012, 05:42 PM | #8 |
Regular Crew
Join Date: Nov 2011
Location: Myrtle Beach, SC USA
Posts: 147
|
Re: 1080 vs 720, i vs p
Why would you shoot in 720p (60p/30p/24p) instead of 1080p? I have read somewhere that 720p can be a very good-looking spec.
Leon |
December 20th, 2012, 09:11 PM | #9 |
Major Player
Join Date: Dec 2010
Location: New Zealand, Rapaura (near Blenheim)
Posts: 434
|
Re: 1080 vs 720, i vs p
It might be to support a special function, like a faster frame rate you want to slow down in post for smoother slow-mo. Higher frame rates might force you to a lower resolution, but you can upscale 720 footage in a 1080 project, and what you lose by upscaling is more than made up for by the smoother slow-mo.
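The overcranking arithmetic behind this is just the ratio of shooting rate to timeline rate; nothing NLE-specific is assumed here.

```python
# Conforming high-frame-rate footage onto a slower timeline slows
# motion by shoot_fps / timeline_fps. Pure arithmetic sketch.

def slowdown_factor(shoot_fps, timeline_fps):
    return shoot_fps / timeline_fps

print(slowdown_factor(60, 24))   # 2.5x slower on a 24p timeline
print(slowdown_factor(50, 25))   # 2.0x slower on a 25p timeline
# A camera that only offers 60p at 720 can still deliver smooth 2.5x
# slow motion when that footage is upscaled into a 24p 1080 project.
```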
__________________
Stills at: www.flickr.com/photos/trevor-dennis/ |
December 20th, 2012, 10:09 PM | #10 | |
Inner Circle
Join Date: Aug 2005
Location: Atlanta/USA
Posts: 2,515
|
Re: 1080 vs 720, i vs p
Personal preference is also a factor. Some people simply hate interlaced video because it looks like TV, while progressive looks a lot like film. Sure, 1080p is the best of the bunch, but that comes at a price: significantly higher bitrate (limited in the case of TV transmission). In other words, 720p at a higher bitrate might look better than 1080i at the same bitrate. Then again... the best is what looks best to YOUR eyes.
December 21st, 2012, 07:03 PM | #11 | |||
Inner Circle
Join Date: Jan 2006
Posts: 2,699
|
Re: 1080 vs 720, i vs p
All that has changed now. Cameras with true 1920x1080 resolution are commonplace, even at relatively low price points, and screens with 1920x1080 resolution are now the norm. Use those for a comparison and the difference is far from small, believe me.

But 720p systems are normally 720p/50 or 720p/60, and as such have the same fluid motion appearance as 1080i. Casual viewing wouldn't be able to tell 1080i/25 and 720p/50 apart in the sense of "looks like TV" or "looks like film": they are both showing 50 images per second, each captured 1/50s apart. Conversely, 1080p/25 will have the "film look" to motion, and may be carried over the 1080i transmission system as 1080psf/25. That is exactly how most drama-type shows are transmitted on European HD channels, whilst the same transmission system carries true 1080i/25 for programming such as sport or news.

All this was gone into in great depth some years ago by the BBC research department; see http://downloads.bbc.co.uk/rd/pubs/w...les/WHP092.pdf . The underlying science is very sound, but the conclusion was out of date almost before it was published. They hadn't anticipated just how big affordable screens would quickly become, and based a "720p is good enough" recommendation on 37" screens being the norm for quite a while to come. In practice, 42" became the figure to go by, possibly bigger still; hence the same research points to a 1080 system (as the BBC did indeed adopt).
January 28th, 2013, 01:31 PM | #12 |
New Boot
Join Date: Jan 2013
Location: Calgary, Alberta, Canada
Posts: 13
|
Re: 1080 vs 720, i vs p
I like to shoot in 1080p and do the final edit in 720p. That allows me to recompose, zoom, tilt and pan in post-production without losing resolution. Seems to work for me. I agree that 'p' looks more natural than 'i'.
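The reframing headroom this workflow relies on is a simple ratio: how far you can punch in to a 1920x1080 source before a 1280x720 output frame needs upscaled pixels. A sketch of the arithmetic, with no particular NLE assumed:

```python
# Maximum punch-in (digital zoom) from a larger source to a smaller
# timeline before the output starts using upscaled pixels.

def max_punch_in(src_w, src_h, out_w, out_h):
    """Largest zoom factor that still fills the output from real pixels."""
    return min(src_w / out_w, src_h / out_h)

zoom = max_punch_in(1920, 1080, 1280, 720)
print(zoom)   # 1.5: room to recompose, pan, or "zoom" in post for free
```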
January 29th, 2013, 12:47 AM | #13 | |
Inner Circle
Join Date: Dec 2004
Location: Arlington, TX
Posts: 2,231
|
Re: 1080 vs 720, i vs p
So one would choose 720p60 to get 60p, as it is the only way to get 60 progressive frames and deliver them in their true state.

Also, bit for bit, 720p gets less compression at the recording stage than 1080p. If you have a 50 Mbps bucket, 1080p will fill it up faster than 720p.

In my view, the 720p mode looks more refined than the 1080p modes on sub-$15,000 cameras, especially the sub-$6,000 cameras. I often shoot in 720p60 and it looks great.
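The "50 Mbps bucket" point is worth putting in numbers. At the same recording bitrate and frame rate, 720p has far fewer pixels per second to encode than 1080p, so each pixel gets more bits. A sketch (the 50 Mbps figure is from the post; this is plain arithmetic, not a codec model):

```python
# Bits per pixel from a fixed recording bitrate at 60 fps.

BUCKET = 50_000_000   # bits per second, as in the post

def pixels_per_second(w, h, fps):
    return w * h * fps

bpp_720  = BUCKET / pixels_per_second(1280, 720, 60)
bpp_1080 = BUCKET / pixels_per_second(1920, 1080, 60)
print(f"720p60: {bpp_720:.3f} bpp, 1080p60: {bpp_1080:.3f} bpp")
# 720p60 gets exactly 2.25x the bits per pixel from the same bucket,
# i.e. gentler compression at the recording stage.
```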
January 29th, 2013, 11:31 PM | #14 |
Inner Circle
Join Date: Aug 2005
Location: Atlanta/USA
Posts: 2,515
|
Re: 1080 vs 720, i vs p
Here's a solid article to document my point: on your average TV, there is no visible difference between 720p and 1080i. Read the numbers: "Why 4K TVs are stupid", CNET Reviews (TV and Home Theater).

A short quote: "A 720p, 50-inch TV has pixels roughly 0.034 inch wide. As in, at a distance of 10 feet, even 720p TVs have pixels too small for your eye to see. That's right, at 10 feet, your eye can't resolve the difference between otherwise identical 1080p and 720p televisions."

Read the entire article... fascinating numbers about the physics of the human eye. Just to clarify, I am not talking about studio monitors or projectors.
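The quoted figures can be checked with a little geometry. The eye resolves roughly one arcminute; compare that with the angle one 720p pixel subtends at 10 feet on a 50" 16:9 set (assumptions: 16:9 panel, viewing distance measured straight-on).

```python
# Angular size of one pixel, in arcminutes, for a 16:9 panel.

import math

def pixel_arcmin(diagonal_in, h_pixels, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width
    pixel_in = width_in / h_pixels                    # pixel pitch
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

a = pixel_arcmin(50, 1280, 120)   # 50" 720p viewed from 10 feet
print(f"{a:.2f} arcmin per pixel")
# ~0.98 arcmin: right at the eye's ~1 arcmin limit, and the pixel
# pitch works out to ~0.034", matching the figure quoted above.
```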
February 7th, 2013, 07:08 AM | #15 |
New Boot
Join Date: Jul 2012
Location: Edinburgh, UK
Posts: 21
|
Re: 1080 vs 720, i vs p
Thanks for all the replies on this. Apologies that I've not replied sooner.
It's certainly a complicated topic and I appreciate all the detailed responses.

Most televisions have settings of 1080p and 1080i to select from. If broadcasters are sending signals in 1080i, what is the television doing when I set it to 1080p? Presumably it doesn't do anything, and it's reserved for things like Blu-ray players?

Similarly, my friend has his PC hooked up to his TV for use as a monitor. Despite living in the UK, I noticed that his TV was displaying at 60Hz. I'm assuming this is because PCs work differently from broadcast signals, and it probably has a frame rate of 30fps or more?

I think I can feel my brain melting... does anyone know of any good books that I could pick up regarding the different video formats and their uses? It would be nice to have as a reference; reading posts across various forums often gets me more confused! So much terminology...

Thanks again.