Quote:
Of course, distance is a perception factor, but it sure looks "day and night" to me when switching from the same movie playing in 480p DVD versus HD, looking at a 50" plasma from about 15 ft. It's really not until you can do a direct comparison that things become more obvious.
Quote:
Sure is a lot of math being talked about here. Strange, because not too many viewers watch TV with a calculator and test equipment in their hand... ;-) Here's a simple test for you: Shoot a scene in 720p, and upconvert it to 1080i and 1080p. Shoot the identical scene in 1080i and 1080p. Compare the two. Native 1080 will CLEARLY look superior to upconverted 720. Downconvert the 1080 footage to 720. Compare it to the natively shot 720. It will match or look superior to the native 720 footage. Sorry lads. The proof is in the pudding, not in the mathematics. I'll put the XDCAM EX against any 720p camera out there.
I think Werner still has a valid point.
Acquiring in 60p gives one more options in post, if those options are needed. Jody, you are correct as well: 1080p will have more resolution than 720p, and therefore the chance to show more detail. But I would be careful using 1080p and 1080i in the same sentence, as 1080i and 720p are very similar. So yes, shooting at 24 or 30 fps, 1080p would be a great choice, so the nod would go to the EX over the 720p cameras. But only the video folks will notice the difference; odds are "normal" viewers will not see much if any difference.
The EBU's stance is not that 720p is better than 1080i, period, but rather that it COMPRESSES far better, being progressive, and is therefore more suitable as a broadcast standard. That's a different matter from which may look the best straight out of the camera.
I'll accept they're right as far as it goes, but the problem with their stance is that it takes no account of 1080p/25 (broadcast psf). Strange, as so much material is made and transmitted that way.
1 Attachment(s)
Quote:
Here is an image I made exactly like you said I should test it. I took a Canon 20D still image and cropped it at 1920x1080 pixels. I then downscaled a 720p version and scaled it back up. I faked a 1080i version by adding in a slight softness, standing in for the low-pass filtering used to reduce interlace flicker. Sure, maybe true 1080p watched as true 1080p is sharper, but it is not "clearly" better, and only tech heads watching it on a computer monitor would ever notice. This image does not give any numbers at all because I do not use numbers: I use my eyes, and my eyes tell me that at the end of the day there is very little difference between 1080i and 720p. If you have a better example please show me, because no matter what type of images I start with, be it 3D rendered, digital still, or actual 1080i stills from HDV, I always find the same to be true. It isn't exactly fair to compare 720p from DVCPROHD tape either, since that only uses 960x720 pixels. We are talking true 1280x720 pixels.
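For anyone who wants to reproduce this, here is a minimal sketch of the kind of test described above, using Python and Pillow. The source filename, resampling filter, and blur radius are my own illustrative assumptions, not the exact workflow behind the attachment.
Code:
from PIL import Image, ImageFilter

# 1080p "reference": crop a high-res still to 1920x1080.
src = Image.open("still.jpg").crop((0, 0, 1920, 1080))

# Simulate 720p: downscale to 1280x720, then scale back up for comparison.
p720 = src.resize((1280, 720), Image.LANCZOS).resize((1920, 1080), Image.LANCZOS)

# Fake 1080i: add slight softness, standing in for the low-pass
# filtering cameras apply to tame interlace flicker.
i1080 = src.filter(ImageFilter.GaussianBlur(radius=0.7))

src.save("ref_1080p.png")
p720.save("sim_720p.png")
i1080.save("sim_1080i.png")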
No motion blur and oversampling... bad example
Quote:
2nd, a still photo with nothing moving is a bad example; you should have some motion blur and similar frame rates. 60fps looks very unnatural compared to 24fps. When I saw Showscan many years ago it was overly sharp with no motion blur: the front rung on the wooden rollercoaster was sharp, which it is not in real life to our eyes or at 24fps. Also, isn't a still image a bit of a joke?! Interlaced looks bad as a still frame compared to a progressive frame, but looks fine or better when seen in motion. Same with natural blur: it looks fine in motion and soft in a still. So I'd say you really need to compare video properly. You need to take still images from its native moving video format, in motion, and more importantly watch the moving footage side by side.
Quote:
I will give you that slo-mo looks better at 720p. So if slow motion is needed, shoot 720p for those parts, but shoot all the rest at 1080! That's what I do... Also, 1080p televisions deinterlacing 1080i footage look fantastic! If a TV can do the math on the fly, imagine what software algorithms can accomplish! I'd say shooting 1080i now is "future proofing" for whenever 1080p60 becomes a broadcast format. I also agree that 1080i is less bit-efficient than 720p, but the difference is negligible. As far as the camera itself goes, 1/2" chips tend to give better color reproduction, latitude, and of course nicer depth of field qualities. All of this translates to a more professional image compared to the 1/3" chip cameras. I think all future prosumer cameras will HAVE to be 1/2" chip cameras in order to combat the low light problem. It really is a physics problem at this point, and Sony is pushing the competition to this arena... which we will all benefit from!
Quote:
I can make a video test as well, but the detail is still going to look the same. All it is going to show is the difference between 540-line fields running 60 times a second and 720-line frames running 60 times a second. They both for the most part look the same and have the exact same type of motion and motion blur; one just has alternating fields that could flicker. Again, this is not a 720p vs 1080i debate, but a demonstration of how the 720p mode from a JVC camera or the EX1 can give very good results. In fact I would even go as far as to say that 1080i and 720p pretty much give you the same thing in the end. One is cleaner, while the other may have a few slivers of detail that sneak through every now and then. There is no "clearly" better format, though. Even when you have 100% true 1080p there just isn't that much of a difference. Sure, it is a little more crisp, but is it really worth that much extra bandwidth? 1080p 24p vs 720p 24p uses a lot of bandwidth for that tiny edge in sharpness. People get so hung up on detail and sharpness when that is only a tiny part of image quality.
Quote:
I have a feeling 1080 24P and 30P is going to be a thing of beauty coming off these cameras. I'm real interested in the 30P for upcoming event work. I'll probably throw in some occasional 720 60P uprezzed to 1080 30P for overcrank slow-mo.
Quote:
Of course 1080i creates an optical illusion of solid frames, but that doesn't change the fact that it is still 540-line fields alternating. Of course this makes it look like 60p, but it can have detail flickering if those details are too small. In order to make sure you do not have details that are 1 pixel thick you have to use low-pass filtering, which in the end brings the sharpness down. The image I posted shows that. The fact that it is interlaced doesn't reduce the level of detail at all; it is the low-pass filtering that reduces the detail of 1080i, and the jagged edges, that are the problem. If you do not agree with what I have done, please do your own test and show me I am wrong. I have yet to see any examples of how 1080i is that much better than 720p.
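For what it's worth, here is a minimal sketch of the vertical low-pass filtering being described, assuming a common [0.25, 0.5, 0.25] anti-flicker kernel (the exact kernel a real camera uses varies):
Code:
import numpy as np

def antiflicker(frame):
    # frame: (1080, 1920) luma plane; smear each line into its neighbours
    # so one-pixel-high detail no longer twitters at field rate.
    padded = np.pad(frame.astype(np.float32), ((1, 1), (0, 0)), mode="edge")
    return 0.25 * padded[:-2] + 0.5 * padded[1:-1] + 0.25 * padded[2:]

frame = np.zeros((1080, 1920), dtype=np.float32)
frame[539, :] = 255.0                  # worst case: a single-line detail
print(antiflicker(frame)[538:541, 0])  # [63.75, 127.5, 63.75] -- softened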
Quote:
When blended together we get 1080 lines. On a good enough TV (Pioneer Elite series or something) 1080 lines definitely look better than 720. Sitting closer to the TV allows you to see an even greater difference. Even on the 55" projection HDTV that I have at home, FOX and ABC look softer than NBC and CBS broadcasts. This was BEFORE I found out how they broadcast. Originally I thought Comcast was giving those channels less bandwidth, but now I know why the picture looks softer. I probably wouldn't notice the difference on a 40" or smaller HDTV, though. Like Jody Eldred said, try converting 1080i to 720 and then try the reverse: 1080 yields better resolution. And since there are reports that this camera has 1,000 lines of resolution on the charts, users should definitely benefit from shooting in the 1080i format. But really, perhaps all this debate is moot when we look at surveys. Contrast ratio is number 1 for how picture quality is perceived; the 1/2" chip offering should help that a lot. And number 2 is color saturation/accuracy. Hopefully Sony gets these right!
Quote:
And enough of the simulated photo-based examples: either compare footage from actual cameras or wait until someone else gets a chance to do so. Thanks.
Quote:
Oh, by the way, just because a particular codec processes the pixels differently doesn't mean it's not full HD: 1440 x 1080i is displayed on a monitor as 1920 x 1080i. 1080i is the best we have until the broadcast bandwidth problems are solved and 1080p becomes the high-def standard.
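As a quick illustrative check of the anamorphic stretch involved (numbers only, nothing camera-specific):
Code:
stored_w, display_w = 1440, 1920
print(display_w / stored_w)  # 1.333... -- each stored pixel displays 4/3 wide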
But 1080i isn't superior... Interlace is a dinosaur from the 1930s when tube technology couldn't scan full-resolution images fast enough to give the perception of a moving image. All new LCD technology is progressive and has to de-interlace interlaced signals to display them... and how well that is done really depends on the set.
Mathematically 1920 x 1080i has the edge over 720p, but 720p has the edge over 1440 x 1080i:

1920 x 540 x 60i = 62.2 million pixels per second
1280 x 720 x 60p = 55.3 million pixels per second
1440 x 540 x 60i = 46.7 million pixels per second
720 x 240 x 60i = 10.4 million pixels per second

However, what does it look like? Personally I think all progressive images look preferable to similar-spec interlaced images. Our eyes do not see interlace, and surely the ultimate goal is to match the image to our eyes? Although I find 720p60 a touch unnerving because the motion seems a little too fluid... It takes away the dreamy 25p haze!

It's interesting how it's partly cultural. A higher proportion of North Americans seem attached to interlace than Europeans. I suspect it is to do with the fact that we Europeans have been seeing 25p (or psf) every time we watch a film on TV for the past 60 years, whereas there hasn't been any historical broadcasting of 30p (or real 24p) in North America, given that films for 60Hz regions get that awful cack-handed 2:3 pulldown. (See, it's cultural.)
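Those per-second figures are easy to verify; a quick check (the format labels are mine):
Code:
formats = {
    "1920x1080i60": (1920, 540, 60),  # 60 fields/s, 540 lines per field
    "1280x720p60":  (1280, 720, 60),  # 60 full frames/s
    "1440x1080i60": (1440, 540, 60),
    "720x480i60":   (720, 240, 60),   # SD, for scale
}
for name, (w, lines, rate) in formats.items():
    print(f"{name}: {w * lines * rate / 1e6:.1f} million pixels/s")
# prints 62.2, 55.3, 46.7 and 10.4 million, matching the list above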
What some of you all don't understand is that the networks' decision on which high definition standard to go with was not too much different from some of our arguments. For example, the U.S. networks CBS and NBC overwhelmingly decided to go with 1080i simply because the top executives could not accept 720p as high definition. ABC and Fox felt that progressive would be better for the majority of their shows, which are sports. Although they were hoping for 1080p, they felt that the faster motion of sports would view better using 720p.
Quote:
Screens with 1920 horizontal resolution have only become a reality at sensible prices in the last year or so, and the same is true of cameras - for 60Hz standards, HDCAM is 1440 horizontal, DVCProHD (1080) is 1280. Only now is 1080 starting to mean 1920, if you like. So the networks' arguments may well have gone along the lines of "why go with a format whose undeniable advantage (horizontal resolution) can't really be taken full advantage of in practice?" Fair point - but things have moved on. HD blue-laser discs are full 1080p, and many screens sold now are 1920x1080 - likely to approach 100% before very long. Differences that once didn't matter now increasingly do, and the danger for ABC and Fox is that more and more people WILL notice differences as screen and camera technology progresses. But hopefully the current 1080p/25 and 1080i/25 mix will go to 1080p/25(50), when hopefully everybody will be happy.
Quote: From ABC executive on decision to go with 720p...
"They usually fail to mention that during the time that 1080i has constructed a single frame of two million pixels, [1/25] second, 720p has constructed two complete frames, which is also about two million pixels. In a given one-second interval, both 1080i and 720p scan out about 60 million pixels. The truth is that, by design, the data rates of the two scanning formats are approximately equal, and 1080i has no genuine advantage in the pixel rate department. In fact, if the horizontal pixel count of 1080i is reduced to 1440, as is done in some encoders to reduce the volume of artifacts generated when compressing 1080i, the 1080i pixel count per second is less than that of 720p". This 720p vs. 1080i argument has been going on a while now. The truth is that in the late 90's different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp. The amazing thing is now the EX does both. Life is good. |
Quote: Originally Posted by David Parks
This 720p vs. 1080i argument has been going on a while now. The truth is that in the late 90's different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp.
Bingo. When they signed those contracts nobody had HD televisions in their houses and the equipment to produce in HD was such an investment that the only way to convince networks to use it was to have the manufacturers foot most of the bill. Hence Sony or Panasonic got to choose the format because they were paying for it. I really don't think pointy-headed pixel peeping (that we like to talk about) had much bearing on the choice for the execs. IMHO
With the EX1 capable of both 720p and 1080p, among others, does that give it an edge over the 250 in this one area? Or do the JVC claim that 720p is here for the foreseeable future, and the ability to easily "uprender" 720p to 1080i, have enough merit to even it up a bit?
If you want 24/25fps motion ("film-look") then the EX should have a big edge theoretically because of the 1080p/24(25). It's when you want 50/60Hz motion that the argument gets more complicated. I don't think anyone will argue 720p/25 is superior to 1080p/25, if you're starting with 1920x1080 chips.
Quote:
Kevin, you are thinking with numbers and not with how it looks. Resolution is not everything, and the bigger an image gets, the harder it is to notice the extra detail. You cannot just say "2.25 times larger" and leave it at that, because that tells nobody how it will actually look. Take a 1080i sample and downconvert it to 720p: sure, you may notice a slight detail loss, but not 2.25 times worth of detail is lost. If you do not trust my sample, which comes from a higher quality source than if I had done it with real 1080i, then do it yourself. Sure, this is a photo, but that actually gives more of an edge to 1080, since the source is such high quality.
Quote:
There is also the fact that if you are shooting 24p/25p you will get much better compression compared to the 1080p flavor. 720p 24p uses 2.5 times fewer frames than 60p, so the 35 Mbit/s will result in much higher quality compression. The 1080p sample would need a bitrate higher than 60 Mbit/s to get the same compression quality as the 720p sample. So sure, maybe the 720p will have less detail, but it will have better compression. If you are shooting an action movie I would rather use 720p, because it should be very hard to break the codec. 720p also has the option of shooting slow-motion segments, which is something you can't really do with 1080p. I would much rather my film have all its shots at the same level of detail and stick with 720p for a consistent look than alternate between the two resolutions. 720p 24p/25p gives an artist a lot more creative options and lighter compression at only a slight loss in detail.
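The bits-per-frame arithmetic behind that claim, sketched naively (this ignores inter-frame prediction, so treat it as a rough bound rather than how an encoder actually behaves):
Code:
BITRATE = 35e6  # XDCAM EX HQ, bits per second

for fps in (24, 60):
    print(f"720p{fps}: {BITRATE / fps / 1e6:.2f} Mbit per frame")
# 1.46 Mbit/frame at 24p vs 0.58 at 60p -- 2.5x more bits per frame

# Naive bitrate for 1080p24 to match 720p24 bits per pixel:
scale = (1920 * 1080) / (1280 * 720)                   # 2.25x the pixels
print(f"1080p24 equivalent: {35 * scale:.0f} Mbit/s")  # ~79 Mbit/s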
Quote:
It's never purely about the numbers; it's only partially about the numbers, and numbers always take a back seat to how the image actually looks. Plus there are other factors that need to be considered, such as viewing distance: there is a point where the viewer's distance from the screen renders moot any difference in "resolution." Thanks Thomas,
Quote:
Of course, I don't know how we got started on 1080i vs 720p, as the EX1 shoots both. I believe the EX1 is a better camera than the JVC 250, mainly because of the sensor size and higher bit rate. I feel the 250 is overpriced for what it is.
Quote:
But I agree with previous posts that too much shouldn't be read into some network decisions; they are not always made for the best technical reasons.
Hey gang, the EX1 will be shipping in a few days and we can start looking at some actual footage to compare that, instead of speculating about it based on technical specs. :-)
Does Blu-ray support 720p playback at 60 fps?
Uprezzing 1080i50 to 1080p50 for a scene with a lot of movement requires deinterlacing first, therefore yielding a lower quality 1080p50 compared to uprezzed 720p50. If the original 1080i footage isn't deinterlaced first you'd get jagged edges on movement.
Anyhow, I think a lot of people get me wrong: interlacing is a very nifty trick. It solved the original problem of tubes years and years ago, for instance. It is just an old-fashioned standard nowadays with virtually no benefits, since all flat panel TVs and monitors are natively progressive. They do accept 1080i, but have to deinterlace it to show it on their screen. Interlacing is only seen in full quality on CRTs, because that's what's native to the screen technology. Since virtually all equipment is progressive, codecs are better adapted to progressive, and so on, why retain the old standard that was only invented because TVs couldn't handle full images at once (i.e. progressive)? How many who have a 1080i camcorder actually have a 1080i-capable CRT monitor, I wonder...

The mathematics are simple enough; I don't have to repeat them. 720p50 packs over 46.0 million pixels per second, while 1080i50 packs theoretically just under 38.9 million pixels per second*. I know which footage I'd like to have on my NLE, especially if I want to do a lot of time-stretching effects.

* In reality it is even less, because if 1080i resolved all 1080 lines, horizontal details 1 pixel thick would cause severe interlace flicker. To avoid that and keep the image steady, interlaced cameras smear details out over several lines - a small gaussian filter effect - working out to a resolution loss of about 30%. More realistically, 1080i50 packs just over 28.0 million pixels per second.

All this combined makes it difficult to believe 1080i is still here. Although, why would Sony have 1080i only on older models, and on recent models also the capability of 720p? 720p and 1080i are both standards that'll be gone soon enough and make way for a 1080p50/60 standard, but in the meantime I'd suggest shooting in 720p50/60 - and if you really want 1080i50/60, you can always DOWNsample your original 720p50/60 to 1080i50/60... (720p50 to 1080i50 is indeed downsampling: 46M to 28M pixels per second.)
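Restating that arithmetic as a quick computation (the 0.72 factor is the poster's ~30% filtering-loss estimate, taken as given, and the 1080i figure assumes 1440 horizontal):
Code:
p720_50   = 1280 * 720 * 50   # 46.08 million pixels/s
i1080_50  = 1440 * 540 * 50   # 38.88 million pixels/s
i1080_eff = i1080_50 * 0.72   # ~28.0 million after anti-flicker loss

for label, rate in [("720p50", p720_50), ("1080i50 raw", i1080_50),
                    ("1080i50 effective", i1080_eff)]:
    print(f"{label}: {rate / 1e6:.1f} M pixels/s")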
All absolutely true if you need 50 or 60 Hz motion. But for most drama etc, 25Hz motion is chosen for aesthetic reasons, and then we're not talking about 1080i/25, but rather 1080p/25. But yes, the sooner 1080p/50 sweeps both 1080i AND 720p away, the better.
My understanding is that the EX1 has a 1080p recording mode, so if it maintains full resolution at that setting it should look pretty good compared to the JVC cameras. And there are other reasons to expect the EX1 to be stiff competition, so it's not just about pixel counts and frame rates.
I have done a few tests now with motion video. I have created many samples of animated video, including simple fast-moving objects and scenes that have a lot of 3D-rendered hair and fur. In the end, with proper filtering, the 1080i version and the 720p version, both played back upscaled to 1080p 60p, look almost exactly the same. I would share these samples, but I cannot think of a way to share 1920x1080 60p video with you guys. All the tests I have done were in special playback software I wrote so it could run the video at 60p.
I would say the 1080i was about 1% sharper, which was kind of an illusion really. The reason it looks sharper is the interlacing. Because the video is interlaced, you end up with 1080 unique lines of detail that are all different. With progressive video the lines tend to blend into each other, making a more natural-looking image. With interlaced, however, you do not see this blend as much because of how the fields are split. 1080i does work very well for HD, and while there can be edge artifacts, they do not show up as much as some would think. 1/60th of a second is pretty fast, and before you could notice anything missing, the missing part is replaced with the next field, and so on. The beauty of 1080i, however, is that even a single field still looks like a 1080 image. A single field with its lines spread over alternating raster lines will look much sharper than a 1920x540 rendered image, so you cannot say 1080i only has 1920x540 per field. Sure, it has that many pixels, but they are spread out better, so they look more detailed.
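A small sketch of that "spread out" point: the same 1920x540 field shown two ways, line-doubled versus placed on alternating raster lines as a CRT would draw it (purely illustrative):
Code:
import numpy as np

field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # stand-in field

doubled = np.repeat(field, 2, axis=0)  # naive line-doubling: soft look
raster = np.zeros((1080, 1920), dtype=np.uint8)
raster[0::2] = field                   # field on every other line, gaps dark

print(doubled.shape, raster.shape)     # both (1080, 1920), very different look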
Quote:
When smooth and clear motion capture is needed, 1080p is not available; you'll have to shoot 720p50 or 720p60. So there remains a conflict between fine detail (1080) and clear motion (progressive). The only thing that has changed is that before we had to select Sony or JVC -- while now we can select between formats on the EX1. Whoever can get 1080p50 and 1080p60 into production gets the prize. (I suspect JVC will announce at NAB 2008, as they are committed to progressive.) What's odd is that the EX1's CMOS chips and DSP are already working at 50Hz/60Hz; it simply isn't recording it. Which is strange, because solid-state recording shouldn't be a limiting factor.
Quote:
This is why you can't use animated graphics as a base. When you resampled that to 1080i, probably no anti-flicker filtering was added. You haven't checked your 1080i on a 1080i CRT, I suppose. Furthermore, most motion graphics aren't subject to the compression issues that make interlace less suited to MPEG, since there's no higher-order correction for redundant info in between fields. 1080i has just 1920x540 per field (or 1440x540). On a CRT without a deinterlacing filter it seems to have 1920/1440x1080 per frame, but it is still 1920/1440x540 per field. If you really want to test: stitch a few videos in 720p50 together to get 1080p50, then downsample to 720p50 and to 1080i50. The difference is there. I'd say it is quite something; others don't mind or don't see it - perception is subjective, and it needs to be said that both 720p and 1080i look rather good (especially compared to SD, even PAL). When you slow down the 1080i and the 720p it is a whole different game: no-one remains that doesn't see the difference.
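A sketch of that suggested test's final step, deriving 1080i50 from a 1080p50 master by weaving one field from each progressive frame (frame arrays assumed to be (1080, 1920) luma):
Code:
import numpy as np

def p50_to_i50(frames):
    # Pairs of 50p frames -> 25 interlaced frames/s carrying 50 fields/s.
    out = []
    for a, b in zip(frames[0::2], frames[1::2]):
        woven = np.empty_like(a)
        woven[0::2] = a[0::2]   # top field from the first frame
        woven[1::2] = b[1::2]   # bottom field from the next frame
        out.append(woven)
    return out

frames = [np.full((1080, 1920), i, dtype=np.uint8) for i in range(4)]
print(len(p50_to_i50(frames)))  # 2 interlaced frames from 4 progressive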
Quote:
What I have found is that even if you did start with totally unfiltered video, the 1080i version still wasn't really that much more detailed, and they both ended up pretty much looking the same in terms of detail - although unfiltered, the 1080i looked like garbage because it flickered all over the place while the 720p still looked very clean. I have even stitched together uncompressed live SD captures to create fake HD, but this can only go so far because SD cameras tend to filter more than what you need with HD. This is also an almost useless test for motion, because you have no way to capture 1080i 60i and 720p 60p at the same time to test the motion; I mostly use this method to test encoders and processing software. As much as I hate 1080i, it really isn't fair to call it 1440x540. If you create one image that is 1440x540 pixels in size and another that is 1440x540 but with blank lines alternated in between, the second one will look sharper. 1080 with every other line duplicated will look sharper than 540 pixels blown up to 1080. It is all about the illusion of detail, and that is exactly what interlace delivers. It doesn't really work to think of interlaced video in terms of pixel size, because there are so many optical illusion factors in play.
Quote:
That's why the ball is now in JVC's court. They could do a handheld version of the HD250 and/or push the art with 1080p50 or 1080p60. Europe needs 1080p50 immediately! Or give us ProHD 422. Or all of the above. A hard disk will take the higher data rates needed for these.