DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   General HD (720 / 1080) Acquisition (https://www.dvinfo.net/forum/general-hd-720-1080-acquisition/)
-   -   My thought - 1080i is dying (https://www.dvinfo.net/forum/general-hd-720-1080-acquisition/45070-my-thought-1080i-dying.html)

David Kennett May 23rd, 2005 02:18 PM

My thought - 1080i is dying
 
I was playing around a little the other day, and I tried some experiments with progressive and interlaced video. Without going into detail, I had no trouble deciding that progressive video really needs to be seen on a progressive display, and interlaced on an interlaced display. If you convert, you give up the advantage of either. And they each DO have their unique advantage.

The problem is, the only really good interlaced display is the CRT. The glow of the phosphors decays, so that the two fields look like motion, not a single image divided into skewed horizontal stripes. I once saw a still frame of a spinning top that had not been de-interlaced. It just sat there spinning forever! This was several years ago, obviously on an interlaced CRT. A progressive image is also degraded when seen on an interlaced display. Are you with me so far? In a nutshell - either format loses its advantage when it must be converted to the other!

Now comes the rub! What kind of monitor (TV?) will be showing tomorrow's pictures? A recent trip to Best Buy yielded a grand total of TWO CRT rear projection sets. About twenty other models were LCD, DLP, or D-ILA. There were a couple of other aisles with a variety of plasmas and back-lit LCDs. There were a couple of monster CRTs of the 35-or-so-inch variety. What's left? A bunch of SD CRTs of varying sizes. There was still a pretty good variety, and they were CHEAP. How many CRT sets will be sold five years from now?

NOTHING has shown up yet in the store to display 1080i native - except for the CRTs. I have yet to see a CONSUMER CRT that looked sharper than the variety of 1280 x 720 sets. And contrast ratios, colorimetry, and SDE are getting better all the time.

So where will all this super video be seen? Probably only on the commercial grade monitor at the production facility.

Keep in mind too that broadcasters, cable and satellite companies will all be frugal in allocating their spectrum. 1080p30 might have a chance, and 1080p60 could make it to HD disc - but then where would it be seen?

I see the CRT dying in a few years - and with it 1080i.

I'm sure there are those who disagree with me - that's why we call it a forum. Have at me!

Radek Svoboda May 23rd, 2005 02:39 PM

1080i shows and other programs are hatching a lot faster than 720p's. After 1080i's die, what will replace them? 1080p. You imply that 720p's will not be dying with 1080i's? They will be awfully lonely in a 1080 world.

Future sets will be 1080p. I seriously doubt that programs upconverted from 720p will look better on them than programs upconverted from 1080i.

35,000 1080i HDV cameras were sold. How many 720p HDV cameras were sold so far?

Radek

Thomas Smet May 23rd, 2005 03:40 PM

"Future sets will be 1080p. I seriously doubt that programs upconverted from 720p will look better on them than programs upconverted from 1080i."


Actually, vertical detail would be better coming from 720p instead of 1080i if you are going to 1080p. Depending on how the 1080i is deinterlaced, you would only really get 540p of vertical detail.

As for horizontal detail, well, the Varicam shoots its 1080p at only 1280x1080, so 1280 horizontal pixels could give a very nice image. This is even more so considering most 1080i cameras are not giving 1920x1080 but only 1440x1080.

Take all of that and, in the best case, you could get 1440x540p from a 1080i source. The 1280x720 frame actually has roughly 19% more raw pixels. Also take into consideration that de-interlacing can give you aliasing artifacts in an image, which can be more obvious than just having a softer image from scaling up.

About the only real advantage to current 1080i HDV is we do get a little bit more chroma detail. With 1280x720p the chroma channels are only 640x360. 1440x1080i has 720x540 chroma channels. The other advantage is that with 1080i all of the scaling and de-interlacing is done before compression. 720p will be scaled after compression.
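The pixel counts being traded back and forth above are easy to verify. A quick back-of-envelope sketch, assuming 4:2:0 chroma subsampling (half resolution on each axis) for both HDV flavors as described:

```python
# Back-of-envelope pixel math for the 720p-vs-1080i comparison above,
# assuming 4:2:0 chroma subsampling for both HDV flavors.

def chroma_plane(width, height):
    """Size of each chroma plane under 4:2:0 subsampling."""
    return width // 2, height // 2

# 720p HDV: 1280x720 luma
print(chroma_plane(1280, 720))   # (640, 360)

# 1080i HDV: 1440x1080 luma; a naive single-field deinterlace keeps
# only 540 of the 1080 lines.
print(chroma_plane(1440, 1080))  # (720, 540)

# Raw luma pixels: 1280x720 vs the 1440x540 "best case" above.
extra = 1280 * 720 / (1440 * 540) - 1
print(f"{extra:.0%}")            # 19%
```

The numbers match the post: 640x360 and 720x540 chroma planes, and roughly 19% more raw pixels for the 720p frame than for a single-field 1440x540 deinterlace.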


"35,000 1080i HDV cameras were sold. How many 720p HDV cameras were sold so far?"


I think this may change soon, with other 720p HD cameras coming out. Since Sony makes the only really decent prosumer HD camera out there right now, you cannot really use this as a point. When Panasonic, JVC, and anybody else come out with their cameras, they will (mostly?) be 720p. This time next year 720p gear could way outsell 1080i gear.



Now if and when we do get 1080p cameras and TVs, well, this debate won't happen anymore.

Mark Grant May 23rd, 2005 04:27 PM

Quote:

Take all of that and in the best case you could get 1440x540p from a 1080i source.
Not true at all. In the best case you get 1440x1080 on a completely static frame. In the _WORST_ case you get 1440x540 on a frame full of motion, but then you probably won't even notice since it's likely to be blurry as hell anyway.

In the average case, with any kind of smart deinterlacing you're likely to get a picture at least as good as 720p when you convert from 1080i to 1080p... and with more horizontal resolution.
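The best/worst cases being argued here can be shown with a minimal toy sketch (not any vendor's actual algorithm). "Weave" keeps both fields in one frame, giving full vertical detail on static content but combing on motion; "bob" keeps one field and line-doubles it, which is the 540-line worst case:

```python
# Toy sketch of the two simplest deinterlacing strategies debated above.
# Frames are plain lists of rows; this is an illustration, not video I/O.

def weave(frame):
    """Best case (static content): keep all stored lines as one frame."""
    return list(frame)

def bob(frame, field=0):
    """Worst case fallback (motion): keep one field, double its lines."""
    out = []
    for line in frame[field::2]:
        out.extend([line, line])
    return out

frame = ["even0", "odd0", "even1", "odd1"]  # 4-line toy interlaced frame
print(weave(frame))  # ['even0', 'odd0', 'even1', 'odd1']  - full detail
print(bob(frame))    # ['even0', 'even0', 'even1', 'even1'] - half detail
```

A "smart" deinterlacer sits between these two: it weaves the static regions of a frame and bobs (or motion-compensates) only the moving ones, which is why the average case lands above 540 lines.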

Radek Svoboda May 23rd, 2005 05:20 PM

>>>Actually vertical detail would be better coming from 720p instead of 1080i if you are going to 1080p. Depending on how the 1080i is interlaced you would only really get 540p for vertical detail.

You're converting to 60p. You'll get a lot more.

>>As for horizontal detail well the Varicam shoots it's 1080p at only 1280x1080 so 1280 horizontal pixels could give a very nice image. This is even more so considering most 1080i cameras are not giving 1920x1080 but only 1440x1080.

Varicam shoots 720p at 960x720 pixels.

>>Take all of that and in the best case you could get 1440x540p from a 1080i source. The 1280x720 actually has roughly 19% more raw pixels. Also take into consideration that de-interlacing can give you aliasing artifacts in an image which can be more obvious than just having a softer image from scaling up.

This is not correct.

Thomas Smet May 24th, 2005 12:56 AM

Ok, then find me a camera that actually does smart de-interlacing. The whole point of this topic was hardware, not software. Yes, you can use a smart de-interlacer in software, but any slight movement will still need to be de-interlaced. If you had a ball slowly moving across the screen 3 pixels at a time, it would be interlaced. That ball would also be pretty sharp since it is moving very slowly. It all depends on what you shoot. Even the slightest movement of the camera will cause the image to be interlaced.

Most people will not be using an HD camera to shoot perfectly still scenes. I am talking about real-world usage of a 1080i camera. If you handhold that camera, I don't care how sturdy you think you are, every single part of every single frame will be made up of interlaced lines. Any very slow pans would also be 100% interlaced but still very sharp. Yes, if you shoot pretty much still frames you would have a true 1440x1080 image, but that actually happens much less than you think.

If we depend on a camera or TV to convert our 1080i footage into 1080p chances are that it will be a simple de-interlace conversion. That is why I said in a best case you may only really get 1440x540p. Yes in software on certain shots you may get better but we are talking about consumer hardware here.

It is also a question with the FX1/Z1 whether we even get 1440x1080. Since the chips are only 960x1080 but use pixel shift, I decided to test this out. Even though there is a little bit more detail than if it were only 960x1080, it isn't really 1440 either. It is really somewhere in between, making it even closer to 1280. Heck, if pixel shift worked that well, SD cameras could get by with chips that had only half the pixels.

The Panasonic DVCpro HD format has two specs for 720p and 1080p/i. 720p is 960x720. The 1080 format is 1280x1080. The newer Panasonic Varicam as well as the new handheld Panasonic HD camera use both of the DVCpro HD specs. The older camera only does 720p.

Now don't get me wrong here. I'm not really more for one format or the other. I'm just letting you guys know what I have been finding out with a lot of complex image tests.

What I have been finding out is that at the end of the day it is pretty hard to tell the difference between 720p and 1080i on a HDTV. Certain types of HDTV's may have a slight edge for a certain format but it is very very far from a night and day situation.

Eventually when HDTV's and cameras can do 1080p we will not need this debate anymore. Clearly 1080p would be better than 720p. Would you try to say that 1080i is as good as 1080p? Clearly it isn't or nobody would care about 1080p. The issue at that point in time would then be what old footage will look better on a 1080p display? 720p or 1080i. I think they can both look good if done right.

Mark Grant May 24th, 2005 04:16 AM

Quote:

If we depend on a camera or TV to convert our 1080i footage into 1080p chances are that it will be a simple de-interlace conversion
Yes, but who in their right mind would do that? If you want to get 1080p footage out of a 1080i source, you'll be doing it in software after you edit.

You seem to be deliberately going out of your way to find artificial conditions in which 1080i would look its worst, and then complaining that it's worse than 720p at its best.

Edit: I'd also add that, since I don't have an HDTV, I watch all my HD footage on my PC monitor, which is effectively 1080p. The software doesn't deinterlace the footage, it displays 540p at 50fps with 2x scaling, which is the other alternative for 1080i on a progressive display and doesn't introduce nasty motion artifacts since you see all 50 fields.

Tom Roper May 24th, 2005 10:09 AM

Quote:

Originally Posted by Thomas Smet
What I have been finding out is that at the end of the day it is pretty hard to tell the difference between 720p and 1080i on a HDTV. Certain types of HDTV's may have a slight edge for a certain format but it is very very far from a night and day situation.

I agree. And my take is that neither format is "dying," in fact both are flourishing and will continue to do so. Each will have its detractors and proponents, but the bottom line is that with slight differences between how the images are presented and how motion and color are handled, both end up being about equal in an overall sense.

Thomas Smet May 24th, 2005 03:30 PM

I am not going out of my way to do anything.

This whole thread was started based on watching HD footage on HDTV's and how the two HD formats will look and if the market will support both.

I even ended by saying that really both are pretty much the same. All I ever pointed out is that with standard de-interlacing done with hardware and some software the best you could really have is 1440x540 detail. I never knocked one format or the other. Did I ever say 540 would look like garbage? I just gave the numbers and you are coming to your own conclusions based on what I said. I gave facts. You are giving opinions that it looks good. I agree it does look good. At the end of the day however de-interlacing 1080i or scaling up 540p will give you the same interpolated results.

I gave my information on what I know about 720p vs 1080i to help figure out where the market could be going in terms of HDTV's. This debate is starting to turn way too political to be worth it anymore. You have the 720p people and the 1080i people. As soon as somebody tries to compare the two, somebody gets upset and starts treating them like they are a moron.

As to who would want to use hardware to convert their video: well, anybody. Whenever you plug your camera into a 720p HDTV, it is getting converted. We are talking about hooking up gear and watching on a TV here, not processing footage in a computer.

Radek Svoboda May 24th, 2005 03:48 PM

The most primitive deinterlacer is a line doubler, after discarding every other field. I own an FX1E and can tell you that you get a lot higher resolution, even on movement, in CF25 than with a 540-line scan. So even an inferior hardware deinterlacer is a lot better than what is being discussed here.

Radek

P.S.

Varicam records 720p, never 1080i.

Thomas Smet May 24th, 2005 09:40 PM

http://catalog2.panasonic.com/webapp...Model=AK-HC930


Ok, it isn't the Varicam, but it is the newer Panasonic HD camera. The point I was getting at is the Panasonic DVCpro HD format, not so much the camera. Look at the DVCpro HD codec in Final Cut Pro and you will see it has 720p and 1080i. Any camera that uses DVCpro HD will use the 1280 x 1080i format.

Barry Green May 25th, 2005 12:21 AM

Quote:

Originally Posted by Thomas Smet
Any camera that uses DVCpro HD will use the 1280 x 1080i format.

Not necessarily. The VariCam has no access to the 1080i version of DVCPRO-HD.

The codec and the recording format support both 1080i and 720p. However, the individual cameras built to use the format do not. The VariCam uses only 720p; the HDX400 uses only 1080i.

The new HVX200 will be the first DVCPRO-HD camera that supports both 720p and 1080i (and 1080p carried within a 1080i stream).

Radek Svoboda May 25th, 2005 12:38 AM

The camera signals that Panasonic will deliver 1080p even on larger cameras. They mentioned at NAB that they can use the 10-bit D5 format with an MPEG-4 encoder to deliver a super-quality 1080p camera. It's about time.

Radek

Thomas Smet May 25th, 2005 10:03 AM

Sorry Barry. I meant any 1080 Panasonic camera.

Nick Hiltgen May 30th, 2005 12:33 PM

Supposedly the next generation of F900s will be able to resolve 1080 60p, which should nicely integrate the features you were talking about. I think, however, it's difficult to say that it's the death of one format or another when most people don't have either. The original problem with the 1080p format was broadcast issues. I believe that most of the stations that are going up to HDTV will not switch their signals again in the near future. Especially with the agreed-upon specs from the ITU and SMPTE.

Steve Crisdale May 30th, 2005 06:48 PM

How limited the view can be from 'The Top'...
 
Must be something about getting light-headed from the brain crippling lack of oxygen.

As much as I'm tempted, from my stand-point as an HDTV/HDV camcorder early adopter, to profess insights into the future of the technology and its delivery, I'm also aware that the bigger picture regarding High Definition revolves around its mass distribution and penetration rate into what, till now, has been an analogue and SD viewing world.

Joe Bloggs couldn't give a rat's about 1080i vs 1080p, regardless of the frame rate. He does care about having to buy a new TV because transmissions of Free to Air programming are going digital, and how much that's going to cost. Many people faced with such a decision won't even move to a WS Enhanced Definition set - let alone a HD capable option.

No manufacturer is going to make any decision about HD resolution 'supremacy' until they are certain that a particular HD mode is truly redundant. God!! Even the EU can't make up their mind about whether to go 1080i or 720p, but regardless of whether they do or not, you'll still be able to watch 1080i material very nicely indeed on an HDTV with 1280x720 resolution. Many countries that do have HD broadcast guidelines have already given broadcasters the option to broadcast 720p or 1080i, so there'll be plenty of HD material being gathered for both.

It's kinda weird to be hearing this guff.

I actually feel privileged to be able to afford to shoot the sort of image quality - maybe even better - that even major broadcasters could only barely afford just a year or two ago.

The current crop of HDV camcorders represent a level of inclusion in cutting edge technology for those who may have previously been excluded, despite their desire and capability, from a very exclusive industry. That's gotta be pretty damned scary for quite a few in the Television and Entertainment industries!!!

1080i is dying... Give me a break!!!

Radek Svoboda May 31st, 2005 02:21 AM

>>Many countries that do have HD broadcast guidelines have already given broadcasters the option to broadcast 720p or 1080i, so there'll be plenty of HD material being gathered for both.<<

Except for USA, show me one country with 720p broadcast.

Matsushita and Sony are the world's foremost electronics manufacturers, and they no longer fight like in the VHS-Beta era. Now they cooperate, as with Blu-ray. They don't want a 1080i-720p war; instead they will both be pushing for 1080p.

Radek

Roland Clarke May 31st, 2005 04:09 AM

In the UK the take-up seems to be overwhelmingly biased in favour of the new Sony 730 and 750, along with a bunch of Z1 HDV cameras and of course the F900. On this basis alone, 720p isn't the way it's going in the UK, and I would surmise this is true for the rest of Europe. Sony's dominance in the broadcast field is to a large extent dictating the expansion of HD.

Since the freelance field in the UK is so prevalent, compatibility is a real issue if you want to keep busy. You just have to look at the equipment that is offered for hire here (cameras/VTRs); almost all of it is Sony.

Regards



Roland

Radek Svoboda May 31st, 2005 05:28 AM

I'm in the Czech Republic. It's all Sony here too. If you want to shoot something for TV, you'd better shoot on Digital Beta.

They don't even plan HD here for now; they've only started digital broadcast. Soon it will be all digital, but not HD.

For film, CineAlta is rarely used; it's all 35mm, and the preferred stock is Fuji, not Kodak.

It's all either film or Sony.

Radek

Duane Smith May 31st, 2005 06:21 AM

Quote:

Originally Posted by Steve Crisdale
1080i is dying... Give me a break!!!

Amen. Thanks for standing up and saying that. :-)

Steven White May 31st, 2005 07:47 AM

Yeah, I can hardly imagine 1080i is going to die.

It really looks to me like Sony is going to flood the market with 1080i content-producing devices on their low end (HDV) and 1080p cameras on their high end. The sheer volume of 1080i content produced with over a year in the market will dominate production.

Furthermore, when the HVX200 arrives, I predict that the majority of users will do everything they can to shoot 1280x1080p to avoid crippling the camera's horizontal resolution with the 1.5 PA ratio of 960x720p (though I also expect they will use the 1280x1080p material to create two versions of their content: a 1280x1080p version and a 1280x720p version). The only people who will be producing predominantly 720p content will be the JVC HDV and Varicam users... and Panasonic is releasing 1080 high-end cameras as well.

This year display manufacturers are working their way towards native 1920x1080p televisions, and I expect the winning HD-DVD/Blu-ray format will store movies in 1080p mode as well.

It's a 1080(i/p) future IMHO. Of course, it's a 1080(i/p) future with crippling compression in broadcast... but that's beside the point. Bigger numbers will win - and since stores always demo TVs with high-bitrate streams, more 1080p televisions will be purchased.

-Steve

David Kennett May 31st, 2005 10:33 AM

1080i works great on a CRT. It is less than stellar on anything else. I think CRTs will eventually die, hence interlaced scanning becomes less valuable as a tool to get additional resolution while keeping apparent frame rate at 60.

Try this (with 1080i if possible) - I know it works this way with 480i. Take a still frame with some identifiable motion, such as a waving hand. Capture the frame WITHOUT de-interlacing. Import the frame into your favorite editor and make a five-second clip. If you look at this in NATIVE format on an interlaced CRT, there will appear a flickering motion of the hand between the two fields. Displaying this image progressively will cause the image of the hand to disintegrate into a single image with the even lines showing one position and the odd lines showing the other, putting apparent horizontal streaks in the hand. If we de-interlace, we lose much of the additional resolution we tried to gain with interlaced scanning in the first place. Only with interlaced scanning AND display do we retain the advantage of a 60 frame rate for motion with the advantage of the lower data rate of 30 frames.
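The waving-hand experiment can be modeled in a few lines. This is a hypothetical toy (rows are just strings, not real video) showing why a single interlaced still carries two moments in time:

```python
# Toy model of the waving-hand experiment above: one interlaced frame
# stores two temporally distinct fields. A progressive display shows both
# hand positions at once as interleaved streaks; an interlaced display
# alternates the fields, so the hand appears to flicker between them.

def interlaced_still(field_a, field_b):
    """Weave two fields (captured 1/60 s apart) into one stored frame."""
    frame = []
    for even, odd in zip(field_a, field_b):
        frame.extend([even, odd])
    return frame

field_t0 = ["hand@A"] * 2   # even lines, sampled at time t
field_t1 = ["hand@B"] * 2   # odd lines, sampled at t + 1/60 s
print(interlaced_still(field_t0, field_t1))
# ['hand@A', 'hand@B', 'hand@A', 'hand@B']
```

The stored "still" contains both positions A and B line-by-line, which is exactly the streaking (progressive) or fluttering (interlaced) behavior described.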

I don't have all varieties of monitors and signals at my disposal. Things might look different with different combinations. My challenge is this: try different formats displayed on different monitors and see for yourself. If the hand doesn't flutter, you have lost the advantage of interlacing. It might be helpful to include a resolution chart in your test footage.

Keep in mind that 720p60 and 1080i30 were chosen because they are both about the same data rate. If you want 1080p60 then you pay the penalty of twice the data rate. Interlaced scanning is a great way of "having your cake and eating it too", but interlaced monitors are slowly disappearing.
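The data-rate parity claim checks out with simple arithmetic. A sketch counting uncompressed luma samples per second (real broadcast bitrates depend on compression, so this is only the raw-rate argument):

```python
# Raw-rate arithmetic behind the trade-off above: 720p60 and 1080i30
# carry a similar number of luma samples per second, while 1080p60
# doubles the 1080i rate.

def luma_rate(width, height, frames_per_second):
    """Uncompressed luma samples per second."""
    return width * height * frames_per_second

p720_60 = luma_rate(1280, 720, 60)    # 55,296,000
i1080_30 = luma_rate(1920, 1080, 30)  # 62,208,000 (two 540-line fields/frame)
p1080_60 = luma_rate(1920, 1080, 60)  # 124,416,000

print(p1080_60 / i1080_30)            # 2.0
print(round(p720_60 / i1080_30, 2))   # 0.89
```

720p60 carries about 89% of the raw samples of 1080i30 (roughly "the same data rate"), and 1080p60 is exactly double 1080i30, which is the penalty being discussed.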

Post your findings with different combinations here, and perhaps we can all learn something.

Radek Svoboda May 31st, 2005 02:30 PM

If you have a 1080p60 monitor, you can show 1080i60 without problems. Or you can convert the footage to 1080p60 and have an even better image.

There is a misconception that interlaced does not look good on progressive displays. It goes like this: I use a 1280x720 display and 720p looks better on it than 1080i. The reason is that a 720p display could only show 720i at most, if it is made to handle interlaced at all. Normally it is not; it's using some cheap deinterlacer to create 720p.

In the era of sophisticated 1080p displays, it will not make much difference if you shot in 720p or 1080i.

Radek

Jack Zhang June 2nd, 2005 11:36 PM

Quote:

Originally Posted by David Kennett
If you want 1080p60 then you pay the penalty of twice the data rate.

H.264 will solve that! 1080p60 @ 25 Mbps at 4:2:2 in H.264 will result in a perfect 60p picture at the same bitrate as MPEG-2 HD interlaced.

P.S. H.264 is also called MPEG-4 Part 10.

David Kennett June 3rd, 2005 01:08 PM

Jack,

It's true that improved compression will help. But of course it can make everything better. One can take the advantage as lower data rate or higher quality. I have never argued against the higher quality - 1080p60 would be great!

My argument is simple. The advantage of interlaced scanning is fully realized ONLY on an interlaced monitor. Since interlaced monitors (CRTs) are disappearing, so will the advantage of interlacing. It just doesn't make sense to me to create an interlaced image, only to have to de-interlace it (losing resolution and creating other strange artifacts) to show it on the only monitors existing (in a few years).

I used to be a staunch believer that the advantages of interlaced scanning were worth the trade-off. Smaller flat panel LCD displays are dropping in price very quickly. At a certain price point, the CRT will disappear. I know! There are some advantages to the CRT, but that is only meaningful to a small group.

It's just a gut feeling I get as I observe the monitor market. Besides, I was wrong once before. Oh! That's the time I THOUGHT I was wrong!

Duane Smith June 3rd, 2005 04:18 PM

Okay, the computer MONITOR market has changed from CRT to LCD fairly quickly (probably around 5 years?), but that hasn't even come close to happening in the television market...and it's probably still several years away because the non-CRT prices are just nowhere near competitive. Just go look at any Best Buy store -- heck, even their website -- and count the number of small televisions; notice how many are CRTs, and how much cheaper they are?

21" - 29" TVs
-- 34 CRT
-- 11 NON-CRT
Cheapest CRT = $139
Cheapest NON-CRT = $949

30" - 39" TVs
-- 33 CRT
-- 10 non-CRTs
Cheapest CRT = $259
Cheapest NON-CRT = $1199

Non-CRT monitors have a LONG way to go before Joe Average buys one for his living room television. Sure, in the big TV (36"+) market the CRT isn't competitive, but on the smaller end of the spectrum -- you know, where most TVs exist -- it's a CRT world. It could be another 5+ years before non-CRTs are as competitive in the television world, and who knows how long before non-CRTs dominate the installed base of televisions. A decade or two?

Claiming 1080i is "dying" is no more valid than claiming a healthy newborn baby is "dying". Sure, the baby will EVENTUALLY die, and 1080i will also certainly EVENTUALLY die. But like a baby, 1080i is just now entering the beginning of its lifecycle, and it's going to be around a long time.

Kevin Dooley June 3rd, 2005 04:27 PM

Okay, aside from the whole 1080i market, I'm pretty sure that the movement away from CRTs won't take another 5 years. LCDs (and other assorted technologies) just keep dropping in price. They're getting closer and closer to competitive every month.

And if the SED technology does what it promises to do, they'll be competitive in another year and a half at the maximum... of course that's a big if, but I don't see DLP, LCD, D-ILA, and Plasma taking another 5 years. They've gone through the stage where they're just dream TVs, or TVs that only the rich have, and they're now at a point where pretty much everyone I know is looking at them as more and more of an option... Sure, it's not going to replace the 13" on my buddy's kitchen counter soon, but if any of my friends' main TVs goes down, I don't know one of them who wouldn't seriously consider spending a couple hundred more to get an HD flat panel (or at the very least the abomination that is EDTV) of some sort...

Steve Crisdale June 3rd, 2005 08:40 PM

Quote:

Originally Posted by Kevin Dooley
Okay, aside from the whole 1080i market, I'm pretty sure that the movement away from CRT's won't take another 5 years. LCDs (and other assorted technologies) just keep dropping in price. They're getting closer and closer to competitive every month.

Just because some company designs and manufactures a product that one individual thinks is revolutionary, desirable and competitive, doesn't mean that every person will: a) think the same. b) buy it simply because they can or c) have enough available cash to lay out on what they may feel is a rash and unwarranted purchase.

If I saw the World as just the area within my own cone of vision, I'd possibly also be beguiled into thinking that LCD WS HDTVs of $4,000+ are going to be quickly bought up for every household. My friends have a different set of priorities in life to me however.

Any change of TV for them will be one that is forced; not by Government guidelines or technological developments, but by total current equipment failure. And what they'll buy to replace any such clapped-out set will be determined by price... certainly they won't be debating the merits of interlaced versus progressive. Most of my friends couldn't give a bugger about HD, because they have families and lives to worry about.

I'm the odd one out in the current video/TV equation... not the vast majority of the populace, and it ain't gonna be 5 years until that changes. Think more like 10 to 15 AT LEAST!!

I will admit that I'm getting asked a lot more about LCD computer monitors being used as pseudo HDTVs by friends and acquaintances who will probably never buy a 'true' WS HDTV, because a monitor purchase makes economic sense to them.

BTW, I have both a CRT and an LCD large screen HDTV... I don't see any difference when viewing interlaced or progressive material - or at least not enough to declare one format 'dead' in comparison to the other. I'd suggest if you do see a marked difference, i.e. enough to put you off, your HDTV needs repair...

Duane Smith June 3rd, 2005 08:58 PM

Quote:

Originally Posted by Kevin Dooley
...but if any of my friend's main TV goes down, I don't know a one of them that wouldn't seriously consider spending a couple hundred more to get an HD flat panel (or at the very least the abonination that is EDTV) of some sort...

I dunno....are your friends married? ;-)

When my main TV went on the fritz last year, it was like pulling teeth to get my wife to agree on buying anything but an el-cheapo $350 TV. The fact that no one we knew had an HD flat panel didn't help. I mean sure, everyone I know *wants* one, but not many folks are actually willing to plunk down $2000+ for a TV when a $350 unit works just fine. Eventually, after much debate, my wife agreed to a $1000 budget for a new TV, and I ended up finding an open-box demo unit of a Philips 34" HDTV (CRT) that fit in my price range. But even with a $1000 budget, I couldn't find any non-CRT based HDTV unit. And believe me, nearly everyone I know freaked out at the thought of me spending $1000 on a television! (Note: you don't wanna know what they think of my camera purchase -- ha-ha!)

LCD, DLP, Plasma, SED...they all need to get so much cheaper it's not even funny. Look at the price differences; you can buy 10 CRT TVs for the price of the cheapest non-CRT unit. Or better yet, you can buy 1 CRT tv and afford to make your kids happy with food and clothing.

People in the REAL world have more important things to worry about than TV.

Steve Crisdale June 3rd, 2005 09:41 PM

Quote:

Originally Posted by Duane Smith
I dunno....are your friends married? ;-)

Either that or separated with kids. I'm separated, but we didn't have any kids, so I guess I can indulge this HD obsession...

Quote:

Originally Posted by Duane Smith
(Note: you don't wanna know what they think of my camera purchase -- ha-ha!)

Too true!! And they're not all poor buggers who don't earn heaps more than me!!

Quote:

Originally Posted by Duane Smith
People in the REAL world have more important things to worry about than TV.

You mean reality isn't compressed into a 60min program that's best viewed in the sort of clarity and precision that only a true 1080p HDTV can provide? The poor deluded fools don't understand what they're missing by choosing to live life that way!! I am being sarcastic BTW... :)

Kevin Dooley June 3rd, 2005 10:29 PM

Wow, there were a lot of assumptions while I was away for the evening...

Yes, like me, most of my friends are married, and with kids no less. And no, I'm not talking $4K+ TVs. I'm talking about a competitive price market. The gap between CRTs and everything else is closing every day. It's not going to take 10-15 years for 19"-30" LCD (or some other tech) WS HDTVs to get to the $350 price range. I seriously doubt it will take 5. That's why I'm saying people are considering their alternatives when they have to get a new set (and yes, most people look at it as a "have to get" when the set goes out).

Oh, and believe me, all those people out there have other things to worry about... but TV is the new opiate for the masses... TV tells people what to think, feel, do and buy... of course they're gonna want to see it on the nicest screen possible.

This is a capitalist country; don't discount the power of materialism in this country. Or, for that matter, the porn industry. It seems all technology these days is driven by one or the other... What a screwy world we live in...

Duane Smith June 4th, 2005 07:55 AM

LOL! Yup, you hit the nail directly on the head, Kevin!

I agree with you -- and for that matter, David, who started this thread -- that EVENTUALLY all of the CRT-based TV sets will disappear, but it's not anywhere near close to happening yet. We've got a couple of years to go before it's economically feasible on the low-end sets, then a couple more before the total death of the CRT as a new product, and then many more before the installed base of users isn't totally dominated by CRTs as it is now. Now obviously I can't tell the future, but it seems highly unlikely that all of these things will occur in any timeframe quicker than 5 years.

David Kennett June 10th, 2005 02:24 PM

Thanks all for responding, and even more for taking the time to think about the process of getting this new-fangled digital video into our living rooms.

Actually, I have some even weirder thoughts. Maybe all this discussion regarding interlaced vs. progressive scanning is kinda like arguing about the merits of various types of horseshoes after the automobile appeared.

It seems to me that all this GOP, DCT, motion-prediction stuff is just a bump in the road to something far more advanced. How about updating individual pixels to provide temporal resolution and spatial resolution as needed? We could then dynamically balance the needs of motion and resolution. There would then be only a maximum frame rate and a maximum resolution.
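For what it's worth, the per-pixel idea can be sketched in a few lines. This is a hypothetical toy, not any real codec: the function name, the `threshold`, and the `block` size are all my own inventions. The point is just that regions which haven't changed keep their full-detail pixels indefinitely, while regions in motion get refreshed every frame — spatial and temporal resolution traded off locally, as needed.

```python
import numpy as np

def adaptive_update(new_frame, last_sent, threshold=12.0, block=16):
    """Toy per-block update: resend only the blocks whose mean absolute
    pixel difference from the last transmitted version exceeds `threshold`.
    Static regions keep full spatial detail; moving regions get full
    temporal resolution."""
    h, w = new_frame.shape
    out = last_sent.copy()
    updated = 0
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile_new = new_frame[y:y+block, x:x+block].astype(float)
            tile_old = last_sent[y:y+block, x:x+block].astype(float)
            if np.abs(tile_new - tile_old).mean() > threshold:
                out[y:y+block, x:x+block] = new_frame[y:y+block, x:x+block]
                updated += 1
    return out, updated
```

Real codecs approximate this with macroblock skip flags and motion vectors, but the dynamic balance described above would go further.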

How have people coped with the lower effective frame rate (30) of the HD-1 and HD-10? By using slower shutter speeds to lower the resolution of objects in motion.
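The shutter-speed trick is simple arithmetic (the numbers below are illustrative examples, not from any camera spec): the length of the motion smear is just image-plane speed times exposure time, so a longer exposure softens moving objects and masks the low frame rate.

```python
def motion_blur_px(velocity_px_per_sec, shutter_sec):
    """Length of the motion smear, in pixels, for an object moving
    across the frame at the given image-plane speed."""
    return velocity_px_per_sec * shutter_sec

# Example: a pan crossing a 1280-pixel frame in 4 seconds moves 320 px/s.
# At 30 fps with a full-frame (1/30 s) shutter the smear is ~10.7 px;
# halving the exposure to 1/60 s halves it to ~5.3 px.
```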

The future will not be like the past.

Thanks again all!

Joe Carney June 10th, 2005 03:38 PM

Quote:

Originally Posted by Radek Svoboda
Panasonic will deliver 1080p even on larger cameras. They mentioned at NAB that they can use the 10-bit D5 format with an MPEG-4 encoder to deliver a super-quality 1080p camera. It's about time.

Radek

Yes, but mainly for live broadcast, not editing. They are talking about H.264 I think, MPEG-4 level 7 (or 10, I forget), which everyone hopes will take off and be accepted. That would make 1080p broadcast possible over existing infrastructure without giving up anything. More likely it will make more advertising-filled, reality-based 1080i streams available. It's also when the truly smart TV will take off, since the format allows for embedded information like web links. (Minority Report, here we come.)

Radek Svoboda June 11th, 2005 10:33 AM

I vote for 3D 1080p HDTV. The technology to make it work without glasses is getting there. If 3D 1080p was good enough for Spy Kids 3D, it's good enough for a TV set.

Radek

Douglas Spotted Eagle June 11th, 2005 12:12 PM

Quote:

Originally Posted by Radek Svoboda
I vote for 3D 1080p HDTV. The technology to make it work without glasses is getting there. If 3D 1080p was good enough for Spy Kids 3D, it's good enough for a TV set.

Radek

I'm totally with you. Every box of cereal can come with free 3D glasses. Sony can offer professional 3D glasses complete with medulla-enhanced subsonics for that "really-there" feeling.

Steve Crisdale June 11th, 2005 06:15 PM

Quote:

Originally Posted by Douglas Spotted Eagle
I'm totally with you. Every box of cereal can come with free 3D glasses. Sony can offer professional 3D glasses complete with medulla-enhanced subsonics for that "really-there" feeling.

Cool.... Then we can enjoy the posts discussing the merits or otherwise of 3D glasses from the different cereal manufacturers...

Douglas Spotted Eagle June 11th, 2005 06:23 PM

Quote:

Originally Posted by Steve Crisdale
Cool.... Then we can enjoy the posts discussing the merits or otherwise of 3D glasses from the different cereal manufacturers...


:-) It's about as relevant as this thread, to a great extent.
1080p isn't going anywhere; Thomson/Grass Valley is gambling heavily on it, and so is Sony. I don't have enough inside information to know about Panny or JVC, but I suspect they won't have much good to say about it. Still, Sony and Philips/Thomson/Grass Valley (whatever their name is this week) have by far the greatest number of cams, media servers, storage devices, and switchers out there. They have a lot of influence.
Further... we never know what's gonna happen. A few months back, several people suggested the DTV initiative would die based on the economic demands of the common man.
Well... just last week, Congress moved UP its mandate for digital-ready television by 5 months! NAB is turning handsprings. In fact, everyone except the consumer watchdog groups seems to be thrilled with what is pretty much a foregone conclusion: we're gonna make the DTV deadline. Now, if we could just get everyone to buy 1080-capable displays... :-)

But I DID hear that Lucky Charms will have the best 3D glasses.

Steve Crisdale June 11th, 2005 06:48 PM

Quote:

Originally Posted by Douglas Spotted Eagle
:-) It's more or less about as relevant as this thread to a great extent.

But I DID hear that Lucky Charms will have the best 3D glasses.

And there I was thinking the 3D glasses most likely to get the desired effect would be those found in laxative-type cereal packs...

Joe Carney June 11th, 2005 08:45 PM

Hmmm....Jenna Jameson in 3D? Could we handle it?



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network