DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Panasonic LUMIX S / G / GF / GH / GX Series (https://www.dvinfo.net/forum/panasonic-lumix-s-g-gf-gh-gx-series/)
-   -   Mjpg 24p 1280x720 (https://www.dvinfo.net/forum/panasonic-lumix-s-g-gf-gh-gx-series/237502-mjpg-24p-1280x720.html)

Anna Uio June 17th, 2009 09:35 AM

Mjpg 24p 1280x720
 
Hi all,

Anyone know if one of the GH1 modes is Mjpg 24p 1280x720? Is it pulldown? How's the quality, i.e. how bad are the compression artifacts?

Thanks,
Anna

Ian G. Thompson June 17th, 2009 09:56 AM

The MJPEG is actually 30p @ 30Mbps. It does not look bad at all to me. This cam's 24p is using AVCHD @ 17Mbps. But the image has lots of issues in regards to artifacts. There are some workarounds it seems though. The cam also has 60p @ 17Mbps which is more robust than its 24p mode but you take a hit in resolution.
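
For a rough sense of what those bitrates mean per frame, here's some simple arithmetic on the nominal figures (a sketch only -- MJPEG is intraframe, so its budget applies to every frame, while AVCHD is interframe, so its per-frame number is just an average):

# Per-frame bit budgets for the three GH1 modes (nominal rates)
modes = {
    "MJPEG 720p30 @ 30 Mbps":  (30e6, 30),
    "AVCHD 1080p24 @ 17 Mbps": (17e6, 24),
    "AVCHD 720p60 @ 17 Mbps":  (17e6, 60),
}
for name, (bps, fps) in modes.items():
    print(f"{name}: {bps / fps / 1e6:.2f} Mbit per frame")
# MJPEG gets ~1.00 Mbit/frame; 1080p24 ~0.71; 720p60 ~0.28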

Anna Uio June 17th, 2009 11:28 AM

Strange... I wonder why they have Mjpg 30p and not also 24p. The bitrate would be lower, or they could up the quality. Seems like a feature that would appeal to a lot of people.

Cheers,
Anna

Steve Mullen June 18th, 2009 08:14 PM

Quote:

Originally Posted by Ian G. Thompson (Post 1159688)
The MJPEG is actually 30p @ 30Mbps. It does not look bad at all to me.

In the prototype, the MJPEG looked much better than the AVCHD clips. It also edits very easily. The 30.000fps rate (rather than 29.97) shouldn't make a difference unless one takes long shots.

PS: I'm not sure there are ways to improve the 24p video unless Pana makes a firmware change. If they really did limit the bit-rate on 24p to half that of 60p -- that seems like a bug. Unless the AVCHD encoder's compression limit is somehow frame-size dependent.

The "hit" in resolution depends on your point of view. 720p60 has half the spatial resolution of 1080p24 but over twice the temporal resolution. So if you ask if there is a QUALITY hit -- the answer is NO. You simply trade one type of "resolution" for another.

Ken Ross June 20th, 2009 08:33 AM

I've found, judging from broadcast HD, that I much prefer 1080i to 720p. Since most displays are now progressive in nature, the 1080i gets converted to 1080p anyway. Yes, on occasion you may still see some minor artifacts with 1080i, but the greatly increased resolution of 1920X1080 vs 1280X720 (or thereabouts) more than makes up for that, in my opinion. If you view on a large-screen HDTV, the differences can be very significant. If you have a smaller screen, or one that is not 1080p, the differences will be less significant.

Keep in mind too that over 95% of material you see on TV is static in nature, or nearly so, thus the advantages of 720p's progressive nature are relatively minor.

As an aside, I should have my GH1 sometime next week. I'm looking forward to this camera!

Steve Mullen June 20th, 2009 04:11 PM

Quote:

Originally Posted by Ken Ross (Post 1161113)
Since most displays are now progressive in nature, the 1080i gets converted to 1080p anyway.

With a 1920x1080 DLP or many plasma HDTVs you do indeed get 1920x1080P. However, with all but a couple of LCD HDTVs -- you SEE only 1920x540 (540p) when viewing 1080i. Because the vast majority of HDTVs are LCDs -- "most" people see a more balanced resolution when viewing 720p since they SEE 1280x720. In fact, they SEE roughly the same 1Mpixels no matter which format they view, because 1280x720 and 1920x540 each carry about a megapixel.
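
To make "bob" concrete, here is a minimal sketch of line-doubling in NumPy (illustrative only, not any TV's actual firmware):

import numpy as np

def bob_deinterlace(field, parity=0):
    """'Bob': rebuild a 1080-line frame from a single 540-line field
    by repeating lines. The panel shows 1080 rows, but only 540 of
    them carry real picture information."""
    h, w = field.shape
    frame = np.empty((h * 2, w), dtype=field.dtype)
    frame[parity::2] = field          # the lines the field actually has
    frame[1 - parity::2] = field      # missing lines: repeat (a TV may interpolate)
    return frame

field = np.zeros((540, 1920), dtype=np.uint8)   # one 1080i field
print(bob_deinterlace(field).shape)             # (1080, 1920)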

Thus, the idea that there is a difference in favor of 1080i is pure marketing. With the push toward very flat HDTVs, DLPs are now rare. And plasma is all but dead. That leaves LCDs -- and it's the cheap LCDs that are selling the most units. So it's fair to say that within a year only a few will have HDTVs that can show 1920x1080p.

Unfortunately, to SEE the extra 1 million pixels from 1080i they will get all the traditional interlace nasties: line-twitter and flicker, interlace artifacts, and far less clear motion. Bottom line -- 1Mpixels is all most people will ever see. Shooting and recording 2Mpixels makes folks feel better -- due to marketing, but in the end that's all that's needed for an audience.

The horrible interlace artifacts are why the EBU took the stand that Europe should move to progressive as fast as possible. Sony is already shipping 1080p60 cameras and switching gear to meet the needs of Europe. This will require a distribution shift to h.264 and, given our economy, it could be a decade before we in the USA get 1080p60. (1080p24 is far simpler and can be done now.)

PS 1: The only LCDs that can show 1920x1080 are 2 models from Samsung and are 120Hz units. In other words, given HOW LCDs work -- an LCD at minimum must run at 120Hz. However, how do you get 120fps from 30fps or 60fps? Therein lies the rub. The process of upscaling MOTION can introduce artifacts. Getting a clean 120fps from 60p is much simpler than from 60i.
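
The difference in difficulty is easy to sketch at the frame level (hypothetical Python; real motion-compensated rate conversion is far more complex):

# 60p -> 120Hz: just show every frame twice -- no guessing involved
def p60_to_120(frames):
    return [f for f in frames for _ in (0, 1)]

# 60i -> 120Hz: each field must first be turned into a full frame
# (deinterlacing -- the error-prone step), and only then repeated
def i60_to_120(fields, deinterlace):
    return [f for field in fields for f in (deinterlace(field),) * 2]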

PS 2: All this favors DLP and a few high-end plasmas.

Ken Ross June 20th, 2009 04:21 PM

Steve, I couldn't disagree with you more and I've had both types of displays (720p, 1080p, LCD & plasma). Tested results from totally independent testing magazines consistently show a far higher resolution with 1920X1080. This is not even in dispute. There is just no question that the measured resolution and picture sharpness is far higher with 1920X1080 vs 1280X720p. Watch any good broadcast in 1080i vs 720p and you'll see much more detail in the 1080i broadcast.

My Pioneer Kuro 60" shows FAR more detail than any of my prior 768p Fujitsus ever could. The Fujitsu was considered one of the best back then and the Pioneer is considered the best HDTV ever made.

As far as artifacts are concerned, I suggest you look at a high quality plasma like a Pioneer. You'll be hard pressed to see ANY artifacts, regardless of whether it's 720p or 1080i broadcasts.

I don't think you've seen modern displays if you still think that 1080i is ridden with artifacts. It simply isn't so.

I also have no idea where you're getting your info regarding only 2 LCDs capable of 1920X1080...there are FAR more than that. My goodness, Sony, Samsung, Toshiba, lower end brands....many out there capable of that resolution.

Steve Mullen June 20th, 2009 07:46 PM

Quote:

Originally Posted by Ken Ross (Post 1161264)
Steve, I couldn't disagree with you more and I've had both types of displays (720p, 1080p, LCD & plasma). Tested results from totally independent testing magazines consistently show a far higher resolution with 1920X1080. This is not even in dispute. There is just no question that the measured resolution and picture sharpness is far...

Exactly the reverse -- truly independent testing for the past 3 years has shown the deinterlacer (used only for 1080i) in the MAJORITY of HDTVs fails to deliver more than 330- to 580-lines of VIDEO INFORMATION. That's because "bob" (line-doubling) is used.

http://www.hdguru.com/will-you-see-a...exclusive/287/

Moreover every scientific test has shown greater perceived resolution from 720p over 1080i. These tests are why the EBU voted against interlace no matter the resolution. Moreover, in the ATSC meetings every computer company voted against interlace. See this Broadcast Engineering story:

Restricted Content - Has HD officially arrived in Europe?


Interlace delivers an inferior picture: artifacts and when displayed, no greater REAL resolution. It is a form of compression that was needed because MPEG-4 was not yet available. There is NO support for interlace anywhere except from those who believe they see more because the marketing NUMBERS are bigger. For example, "The rule seems to be that big numbers must be good, because they impress the public. In the move to HD, many networks chose 1080i over 720p, even though careful tests from bodies like the EBU proved that 720p pictures looked better, plus they had the advantage that compression of progressive scanning is more efficient."

http://broadcastengineering.com/view...els/index.html

And, the reason is simple -- even though fields (60i) are converted to frames (60p) by the deinterlacer (a process that often does NOT work well) the VIDEO INFORMATION in each 1920x1080 frame is 1 Mpixel -- exactly the same as from 720p60 which doesn't need to be deinterlaced. Thus, every second, 60 HALF pictures are shown from 1080i while 60 FULL pictures are shown from 720p. Obviously, the eye sees P as having more resolution than I.
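
The per-frame numbers behind that claim (simple arithmetic):

# Real picture information in each displayed frame
print(1920 * 540)   # bobbed 1080i: 1,036,800 pixels of information
print(1280 * 720)   # native 720p:    921,600 pixels -- roughly the same ~1 Mpixel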

The only exceptions are found with certain high-end plasmas (your Kuro, no longer made because only a few would buy them), the 150-inch Panasonic plasma, and a few large Samsung plasmas (which may be discontinued by now).

These few sets use an adaptive deinterlacer which will -- ONLY for static pictures -- deliver nearly all 1080-lines using "weave." On motion, they use "bob" on those areas with motion. So moving objects have HALF vertical resolution. In theory, because the eye sees objects in motion as blurred, the loss of vertical resolution isn't too noticeable. But clearly an image made up of some objects with FULL vertical resolution and some objects with HALF vertical resolution has less QUALITY than a solid 1280x720 image.

Moreover, adaptive deinterlacers need to gracefully switch between bob and weave AND need to switch between interlace and progressive (film). This is nearly impossible to do without switching artifacts.
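
A toy version of the bob/weave decision, to illustrate (a NumPy sketch under simplified assumptions -- real adaptive deinterlacers are far more elaborate):

import numpy as np

def adaptive_deinterlace(cur_field, other_prev, other_next, parity=0, thresh=10):
    """Motion-adaptive deinterlace: 'weave' where the two neighboring
    opposite-parity fields agree (static areas keep full vertical
    detail), 'bob' where they differ (moving areas drop to half)."""
    h, w = cur_field.shape
    frame = np.empty((h * 2, w), dtype=cur_field.dtype)
    frame[parity::2] = cur_field                       # lines we actually have
    motion = np.abs(other_prev.astype(int) - other_next.astype(int)) > thresh
    weave = other_prev                                 # static: the real missing lines
    bob = cur_field                                    # motion: line-double instead
    frame[1 - parity::2] = np.where(motion, bob, weave)
    return frame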

Bottom-line -- it is irrelevant what you claim you see. Industry experts differ strongly. The technology of interlace inherently does NOT deliver high-quality video. At the camera there is a 25% loss of vertical resolution from the sensor. This low-pass filtering is necessary to TRY to prevent line-twitter. However, watch a vertical pan of the stands in a game and you'll see the attempt does not work. Twitter is horrible. Now watch ESPN or FOX. No ugly twitter!

Then, unless you own one of a few "good" HDTVs, the deinterlacer cuts the ALREADY LOWERED BY 25% vertical resolution by HALF. And, even if you own one of the few better HDTVs, objects in motion lose HALF vertical resolution.

LCDs have a different problem. Only ONE of the 120Hz Samsungs delivers near 1080-lines of motion VIDEO RESOLUTION. All other 120Hz LCDs on the market failed to deliver more than about 580-lines, and 60Hz LCDs generally delivered only 330-lines. Yes -- only 330-lines.

This situation may be because Samsung put an adaptive deinterlacer only in their top-of-the-line 120Hz HDTV. Or, it may be the way LCDs write to a panel -- only half the rows get NEW information on each refresh, so doubling the number of refreshes doubles the number of lines that can be presented. If it is the former -- then 720p will be perfect. If it is the latter -- then both 1080i and 720p will be cut to 540p60 unless one buys one of the top-of-the-line 120Hz Samsungs. (I'm investigating which reason is true.)

My Broadcast Engineering story at:
http://files.me.com/dvcinlv/75cegw

And, my story in Broadcast Engineering at:
http://broadcastengineering.com/test...ors/index.html

Ken Ross June 20th, 2009 09:42 PM

Gary Merson's testing was done almost two years ago. I know Gary, he lives near me and had done an ISF on an old Panasonic CRT HDTV I had owned.

What you may not be aware of is that Gary did a subsequent test much later and found that the state of deinterlacing had MUCH improved in almost ALL HDTVs he tested. Modern deinterlacers do not achieve their end result by simply bobbing. In fact, Gary's later testing found far greater resolution even in motion testing. Those lower-rez days are largely over. Horizontal and vertical resolution is FAR FAR higher than what you quoted for a good quality 1080p HDTV. Your numbers make very little sense, Steve, since they're almost the same as those achieved for SDTV. Now please, please don't tell me I'm seeing very little more resolution with my HDTV on 1080i than I did with my old standard-def TV. C'mon now, there isn't a person that's seen any HDTV I've ever owned (or any HDTV anyone has owned) that wouldn't laugh out loud at that comment.

The Pioneer Kuro has been measured significantly higher with a 1080i source, as have many other HDTVs Gary has tested. In fact, resolution down to the pixel level has been achieved...the FULL 1920X1080. I am most assuredly not alone in seeing much greater detail in 1080i broadcasts as opposed to 720p. The eyes see it and the measurements prove it. Again, the vast majority of broadcasting is static or nearly static (yes, even sports) in nature, and so even with TVs that don't do a great job of deinterlacing, artifacts are not significant.

720p requires much less bandwidth, and it was for that reason that some broadcasters chose it. However, more chose to go with 1080i than with 720p.

And Steve, of course computer companies voted against interlace, it's contrary to their best interests! They live in a progressive world, so to speak. :)

Steve Mullen June 21st, 2009 03:18 AM

Quote:

Originally Posted by Ken Ross (Post 1161332)
Gary Merson's testing was done almost two years ago. I know Gary, he lives near me and had done an ISF on an old Panasonic CRT HDTV I had owned.

What you may not be aware of, is that Gary did a subsequent test much later and found that the state of deinterlacing had MUCH improved in almost...

The Pioneer Kuro has been measured significantly higher with a 1080i source as have many other HDTVs Gary has tested...

Gary's last test was published on 2008 models and if you read my link you know it reports that deinterlacing did get better from 2007. IF 2009 models test better, we'll have to wait until he publishes these tests. In any case, let's assume the LCDs do improve because they have been forced to move from 60Hz to 120Hz to solve motion problems. With only 330- to 580-lines of rez in 2008 -- they'll have to improve a LOT to really display anything near 1080-lines.

You are also ignoring the EBU studies showing that 720p60 delivers a BETTER picture than does 1080i. And, you are ignoring that 1080-line sensors running at 1080i are filtered to about 810-lines. There are NO full-resolution 1080i sources other than test instruments. That's why in the real world you can't see the difference between 720-lines and 810-lines. In fact, unless your deinterlacer can deliver 90% or more of the vertical resolution on a 1080i test signal -- a number almost impossible to achieve -- monitors fed 1080i will display LESS than 720-lines.

So despite the claims of FullHD -- real world video programming on 99% of HDTVs will have vertical resolution no better than does 720p. And, in the case of all but a very few camcorders, the CCDs and CMOS chips can NOT deliver even 1000-lines let alone 2000-lines of horizontal resolution. So, other than $50K to $100K video cameras -- nothing records anything near FullHD. Most camcorders are hard pressed to record 1280x720.

Only film transfers on BD are really FullHD -- and these are PROGRESSIVE. But, that's not the topic of this GH1 thread.

Lastly, your Kuro did reach 900-lines under motion, and if you read my BE stories you know it is the ONLY HD monitor I recommend. So I fully agree your Kuro delivers more resolution than almost any other HDTV. Unfortunately, none of these plasmas are being built anymore. :( Folks couldn't see enough difference to pay their high price.

So, now only the 150-inch Panasonic and one Samsung reaches this level of performance. How many folks have 150-inch plasmas at home?

ESPN, FOX, ABC, and NASA did not choose 720p without good reason. The EBU did not recommend 720p50 without good reason. In fact, I know of no study that has ever recommended 1080i. CBS and NBC chose it because the BIG NUMBERS allowed their marketing to claim it was the best. FullHD is a marketing tool. Like Red's "4K" -- which ain't 4K as the term is used by the rest of the industry.

PS: when we get 1080p60 which will neither be filtered before recording nor deinterlaced before display, there really will be 2Mpixels of information to be seen. Behind closed doors I've seen Sony 1080p50 shot in Europe for IBC and presented on an OLED monitor. This truly will blow away 720p50 -- which is exactly why Sony is aiming it at Europe. They are saying skip 720p50 and go directly to 1080p50.

Ken Ross June 21st, 2009 09:29 AM

I think it's a bit arbitrary to say that CBS, NBC, PBS, CNN, HBO, Showtime etc. etc. went to 1080i because of 'marketing'. You mean to tell me that these guys somehow fell prey to marketing and ABC, FOX (lesser networks with less capital to work with) and ESPN were the only 'honest' guys? Not buying it Steve nor do my eyes or my logic.

As I said before, much of your information was outdated, including your belief that only ONE manufacturer, Samsung, was providing 1920X1080 displays. There have been many more manufacturers than that providing full rez displays for several years. And the limitation has nothing whatever to do with refresh rates, it has to do with pixel density. Higher refresh rates in LCDs are being used to provide a 'smoother' image with motion and get around motion lag which has been an issue with LCDs for many years.

Hell, I could clearly see a better, more fully resolved picture on 1080i vs 720p with my 768p Fujitsu plasmas! It wasn't just the 1080p Pioneer that showed me the picture was better, although the Pioneer certainly magnified the differences tremendously. The Pioneer simply demonstrated more vividly that the 1080i SIGNAL was superior to the 720p. Better displays will show the differences to a greater degree, but it clearly shows there are differences that are far from subtle. Good deinterlacing is no longer the sole property of high end displays, many mid displays have very good deinterlacing.

Good displays show it clearly and no advertising, marketing hype or outdated papers with outdated findings can change that. Japan also chose this resolution and I don't think they're prone to this 'marketing hype' either. This is not to say that 720p can't display a beautiful picture, it can. But there is simply no getting away from the fact that there is FAR more detail in a 1920X1080i signal than in 1280X720p...unless the math I grew up with has changed...and that's not change I can believe in. The 1080i signal requires more bandwidth for a good reason: it carries more information! I've also read that one of the reasons NASA chose 720p was bandwidth constraints. And if you think about it, it makes perfect sense. For NASA, bandwidth IS a critical issue.

There have been tons of people who see the same thing and they're not the least bit privy to 'marketing hype'. In fact you hear almost no marketing hype at all from any network anymore advertising 'our 1080i is better than their 720p'. They don't have to, it shows. Honestly, for the life of me, I have no idea how anyone could not see the difference unless they're watching on a small HDTV or a pretty poor one. Virtually everyone I know with an HDTV has commented on it at one time or another. Some are not even aware of why it is and have no idea of the underlying resolution differences.

The same holds true for camcorders. There is an obvious increase in detail going from a 720p camcorder to a 1080i unit. Yes, neither will resolve a full-rez signal due to other limitations, but you'd have to be nearly blind not to see the greater detail in the 1080i signal, which will generally have both a higher horizontal and vertical component in detail.

I should also point out at this time that some of the 'artifacts' that people complain about with broadcast HDTV have nothing to do with the nature of the signal, but rather with multicasting! Multicasting has been one of the true detriments to high quality broadcast HDTV. When a broadcaster decides to multicast on 4 or more different channels, it plays total havoc with the available bandwidth for the main HD channel. This has nothing to do with 'i' or 'p', but I've seen many people blame the format for what is really the result of multicasting.

Steve, again, you need to appreciate the fact that 95% of all broadcast material is static or nearly static in nature. A fact the computer guys don't like to mention, since they will forever push for everyone to conform to their progressive world. That fact alone virtually eliminates any of the old 'artifact arguments'. Several years ago I had a conversation with one of CBS' top engineers and he mentioned to me how misunderstood this concept was. He emphasized how even with sports, the vast majority of the programming is nearly static in nature.

Emmanuel Plakiotis June 21st, 2009 06:29 PM

Steve,

I can confirm that while at IBC two years ago, I read in the free IBC magazine EBU's finding -- based on real-life tests on consumer sets -- that 720p looked better than 1080i.

Your argument is also supported by the fact that even in their latest camcorder (800), Sony halved the vertical resolution of the slow motion (1920X540 at 60P). It was not a hardware limitation. They knew that nobody would see the extra lines on their TVs, and at the same time they protected their digital cinema gear.

In the same manner, NTSC SDTV is effectively only 240p, and that illustrates the real differences between the resolutions of the formats. It also explains why 480p (especially uncompressed) looked so good compared to HDTV (even when transferred to film).

Ken Ross June 21st, 2009 06:49 PM

As I've said, processing has improved considerably in HDTVs. The resolution is NOT half when displaying & viewing 1080i. Test after test after test has proved this. It's time we stop living in the past, folks. If you believe 720p looks better to you, then watch 720p. But I've found most diehards for 720p are diehard computer people. The industry lives in the world of 'progressive' and wants everyone else there. I'd more than welcome 1080p for all sources, but until that happens (and it won't happen for a long time), I'll gladly take 1920X1080.

But there are tons of people out there that have seen both and agree there is significantly more detail in 1080i. I would have to be blind when watching my 60" Pioneer Kuro to not see the additional detail.

Please also keep in mind that when viewing MOVIES, this entire argument is meaningless. You will get the full 1920X1080 (or whatever the movie's actual telecined resolution was) when viewing a BD movie. If 720p was so wonderful, then the movies too would have been mastered at 720p. It is most certainly not the case. If 720p were so wonderful, we'd see HIGHER priced 720p camcorders relative to their 1080i counterparts. This is not the case and the professional broadcast arena's highest priced models are 1080.

As for 480p, yes it can look good if the upconversion is done well. But one only needs to compare that same movie to a BD release to see the huge difference. If you can't see that difference and see it easily, it's time to get a new HDTV or, at the very least, a larger HDTV. I just don't know what else I can say, but it seems that many people here are not seeing the full glory our HDTV standard has given us...and that's a pity.

At any rate, I'm surely looking forward to the arrival of my GH1...I don't know how we got off on this tangent. Fortunately for everyone, the Panny shoots in both modes. However, it looks like there is insufficient bandwidth for smooth motion in the 1080p mode according to some. We shall see.

Steve Mullen June 21st, 2009 07:21 PM

Quote:

Originally Posted by Ken Ross (Post 1161466)
As I said before, much of your information was outdated, including your belief that only ONE manufacturer, Samsung, was providing 1920X1080 displays. There have been many more manufacturers than that providing full rez displays for several years.

Ahhh. I see why you are so confused. You are following the "logic" of most consumers who believe that what counts is the number of pixels. This is why companies adopted the FullHD marketing approach: make panels with 2 million pixels and call it FullHD. (The same marketing is used with cameras: make cameras with 2 million pixels and call them FullHD.)

What engineers know, and what Gary blew the whistle on, is that it's the amount of INFORMATION that's fed to the panel (and recorded from the sensors) that is the real limiter of what you see.

Since I have enough knowledge on this topic to have my information published in the highest-level video engineering magazine worldwide, aren't you a little over your head in this discussion? Since BE publishes my information to educate video engineers, what are the chances that, were I wrong on any of this, I wouldn't receive instant feedback from people like Larry Thorpe ("inventor" of CineAlta while at Sony)? Who, by the way, not only writes for BE -- he reads my stuff.

You have the links by which you could actually educate yourself. As long as you want to talk about what you see and believe -- this exchange is a waste of time.

PS1: BACK TO THE GH1. The reason folks report seeing more resolution from 1080p24 than from 720p60 is because 24p is recorded with 2-3 pulldown. The virtue of pulldown is that once a GOOD deinterlacer (software or hardware) detects pulldown, it switches from video to film deinterlacing. It can now use the cadence to reconstruct perfect 24p frames. These frames have ALL the information in the original 24fps frames -- unlike interlaced video frames, where information can NEVER be fully reconstructed.

These 24fps frames are then repeated in a 2-3 pattern to yield 60p, or repeated 3 times in a Kuro (72Hz), or repeated 5 times for a 120Hz LCD, or repeated 10 times for a 240Hz LCD.
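
Those cadences are easy to express (a small sketch; repeat_cadence and inverse_pulldown are hypothetical helper names, not anything from a real player):

def repeat_cadence(frames, pattern):
    """Expand 24p frames by a repeat pattern: (2, 3) gives 2-3 pulldown
    to 60p, (3,) gives 72Hz (Kuro), (5,) gives 120Hz, (10,) gives 240Hz."""
    out = []
    for i, f in enumerate(frames):
        out.extend([f] * pattern[i % len(pattern)])
    return out

def inverse_pulldown(stream, pattern=(2, 3)):
    """Recover the unique 24p frames once the cadence is detected
    (frame-level sketch; real inverse telecine works on fields)."""
    out, i, k = [], 0, 0
    while i < len(stream):
        out.append(stream[i])
        i += pattern[k % len(pattern)]
        k += 1
    return out

print(repeat_cadence("ABCD", (2, 3)))   # ['A','A','B','B','B','C','C','D','D','D']
print(inverse_pulldown(repeat_cadence("ABCD", (2, 3))))   # ['A','B','C','D']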

The Kuro was the only monitor that could CORRECTLY present film without 2-3 pulldown, because 3X is the highest acceptable repeat rate for film. Once 5X or 10X is used, "film" looks like "video." And 60p viewers get 2-3 pulldown, which is not what you see in a theater.

PS2: 720p passes through deinterlacers untouched and only needs to be scaled up to 1920x1080. And, if the video is 720p25 or 720p30, each frame is repeated twice to generate 50p or 60p -- just like when film is projected.

720p25 and 720p30 are the only video formats that, when viewed on a flat panel monitor, create the same experience as watching film in a theater! In fact, with a 120Hz LCD that displays a black frame after each video frame (MOST DO NOT DO THIS) you'll see something similar to what a projector throws onto a movie screen! (frame > black > frame > black)

1 2 3 4
1b1b2b2b3b3b4b4b
OR
11bb22bb33bb44bb
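
The first pattern is simple to generate (a hypothetical sketch: each 30p frame is first doubled to 60p, then a black slot follows each showing at 120Hz):

def bfi_120hz(frames_60p):
    """Black-frame insertion at 120Hz: each 60p frame is shown,
    then followed by a black flash -- mimicking a film shutter.
    The second pattern above is the same idea with the repeats
    and black slots grouped differently."""
    out = []
    for f in frames_60p:
        out.extend([f, "b"])
    return "".join(out)

print(bfi_120hz("11223344"))   # 1b1b2b2b3b3b4b4b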


PS3: This is why folks who do NOT like the look of film hate 720p25 and 720p30. The motion judder is that of film, which is why they want to shoot 720p50 and 720p60. Conversely, if you really want the look of film projected in a theater, you should shoot 720p25 and 720p30 -- not 1080p24.

Khoi Pham June 21st, 2009 08:17 PM

All I can say is 1080i has much more detail and more wow factor than any 720P material I've ever looked at -- don't care what numbers you come up with, Steve. (-:



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network