View Full Version : 1080i or 720p?


Petr Marusek
January 11th, 2006, 03:30 PM
Asian (Japan, etc.) HDTV broadcast is 1080i only; the majority of US HDTV broadcast is 1080i, and broadcast in Europe is 1080i only. Future HDTV sets will be 1080p and will display 1080i fine; most of the current ones are anything from 480p plasmas promoted as HDTV compatible, to interlaced CRT HDTV sets promoted as 1080i, to true 1080p. Europe's future broadcasts will be a mixture of 1080i and 720p. 1080i, 720p and 1080p are all recommended for HDTV content creation in Europe, regardless of broadcast standard.
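As a rough illustration of what these competing formats mean in raw data terms (a back-of-the-envelope sketch added for reference, not from the thread; frame sizes and rates are the standard broadcast values):

```python
# Rough pixel-throughput comparison of the HDTV formats being debated.
# Illustrative arithmetic only: (width, height, full frames per second).

formats = {
    "720p60":  (1280, 720, 60),   # 60 full progressive frames per second
    "1080i60": (1920, 1080, 30),  # 60 fields/s = 30 full frames per second
    "1080p60": (1920, 1080, 60),
}

for name, (w, h, fps) in formats.items():
    print(f"{name}: {w * h * fps / 1e6:.1f} megapixels/second")
```

This is why 1080p at high frame rates was seen as a bandwidth problem for broadcasters: it carries roughly double the raw pixel rate of 1080i or 720p.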

Future cinema and HDTV broadcast is likely to be 2K/1080p at 48-60 fps. 3D is more profitable and will become more common in cinema, probably even in broadcast.

John Mercer
January 12th, 2006, 02:23 AM
Europe is not 1080i-only because there is no widespread HD broadcasting there yet - it will certainly be both 720p and 1080i when it happens - SKY in the UK, for example, will broadcast in both from mid-2006.

I believe the USA has a pretty even mix too with some broadcasters using 1080i and some like FOX using 720p.

The point is no-one is going to broadcast any time soon in 1080p. It's all to do with transmission bandwidth and satellite transponder space - in the States MPEG-2 is used for transmission, whereas in Europe they will probably adopt MPEG-4. For multi-channel broadcasts of 400 channels and above it is unlikely and impractical that 1080p will ever become the standard.

Cinema could well be 1080p BUT will remain at 24p, not 48-60p, though we may see the emergence of 4K UHD. TV will in all probability use 50/60p for sport and live events and 24/25/30p for drama.

3D has never ever been really profitable. Its main problem is the physical discomfort of watching it for long periods of time, especially with anaglyph spectacles. It has always been a gimmick in the cinema and will remain so. It may come into its own with videogames and scientific or architectural usage, but I would not imagine it will ever gain widespread usage in cinema/TV.

Petr Marusek
January 12th, 2006, 02:36 AM
The new digital cinema standard calls for 48p; IMAX high end is 48p.

3D without glasses technology is here now and is improving quickly.

John Mercer
January 12th, 2006, 02:46 AM
Where does the 'new digital cinema standard' exist for 48p? Filmmakers are obsessed with 24p and are not likely to change that. All existing 35mm production is 24fps - filmmakers wish to emulate this with digital - 48p looks very different to 24p temporally - it looks significantly less film-like and this will be a concern to most filmmakers. IMAX is not digital but a horizontal 70mm celluloid format and as such is not in widespread usage but is used for specialist exhibitions - there are technical reasons why 48fps is used.

"3D without glasses technology is here now and is improving quickly"

Maybe so, but it is not very good, and I still maintain it is a gimmick in TV and film production and unlikely to catch on in a big way.

Matthew Greene
January 12th, 2006, 02:56 AM
I remember seeing a Showscan demo when I was a kid; if I remember correctly it was large-format film at 60fps. I don't remember much beyond a couple of scenes - one was the illusion of a person behind the screen - and I remember buying the effect as real back then.

As far as I was told, IMAX running at 48 fps was adopted mainly for two reasons: the use of 3D, and the tremendously apparent motion judder when you've got a screen/format that large - your camera movement speeds are too limited at 24fps.

I'm afraid that cinematographers might be reluctant to embrace 48fps cinema in most cases (maybe action films?). Steven Poster says it better than I can in his ASC manual bit about frame rates of film vs. video and how the human brain interprets them.

I reserve my comments on 3D, I'm afraid that I'm attached to an IMAX 3D production.

Petr Marusek
January 12th, 2006, 03:00 AM
"Where does the 'new digital cinema standard' exist for 48p? Filmmakers are obsessed with 24p and are not likely to change that. IMAX is not digital but a horizontal 70mm celluloid format and as such is not in widespread usage but used for specialist exhibitions - there are technical reasons why 48fps is used."

The theater owners association agreed on 48p (the standard is something like 50 pages long), in addition to 24p, as a future film speed for digital cinema; this was done together with the studios. Top filmmakers always wanted more than 24 fps. The video wanna be filmmakers always wanted 24p. There was 30 fps 70mm, and 48 fps IMAX costs a lot more than 24 fps IMAX.

Matthew Greene
January 12th, 2006, 03:16 AM
Just as an example of what a leading member of the ASC has to say about the frame rate topic - of course it's not written in stone and it's open to argument.

I just found an online reference that quotes the article I was referring to; here's an excerpt taken directly from http://www.ifvchicago.com/aesthetics/a_article_030101.shtml

Steven Poster, a member of the American Society of Cinematographers, says that "There are productions that are best done on tape and there are productions best done on film. News and sports, special events like variety shows and concerts, news-based and contemporary documentaries, industrials and educational programs are best done on tape."
"Film, however, is best for any storytelling or narrative production." (Poster, Steven, ASC). There are some reasons why that is.
Storytelling is like a book: when you read, you use your imagination to see the story in your mind. According to Poster, when we watch a movie we use the imagination, as opposed to video. "Film is shot at 24 frames per second. At that speed, there is a certain amount of blur in the images. There is also a brief time between, when there is no image at all" (Steve Poster, American Cinematographer Video Manual, 350). This makes the audience use their imagination to fill in the missing information, adds Poster. The film image usually forces our mind to imagine even more than we see on the screen. Video is quite different. While watching the TV screen our mind does not have the same time for imagination as it does while watching film. There is a scientific explanation why.

The TV set shows 60 interlaced images per second, and there is a specific reason why. "Douglas Trumbull did psychological and physiological tests on all kind of audience and determined that 60 images a second is the maximum visual information that can be transmitted through the optic nerve to the brain." (350). There are no blank spots where our brain has time to imagine something besides what is shown. We can close our eyes and see nothing, or we can watch the television screen with no imagination or mental interpretation. What we see on TV our brain accepts as a complete image, with no room for interpretation. "When we see video images we're getting a direct implant of images; we are not having to use our imagination to fill in the blanks." (Steve Poster, ASC). Our eyes don't see the difference in how film and video transmit information, because of the nature of visual perception; our brain, however, feels the difference. You have probably noticed that you accept what you have just seen on TV fully as it is, unless you turn off the TV set and think about it. The claim that TV manipulates our thoughts is supported by science - and not only journalists but also technicians play a big part in it.

Do you want to stay at home, watch a movie, and come away with no conclusions or thoughts about it - for instance thinking that all women on any beach look like "Baywatch" actresses - or is it better to go to the movie theater and use some of your imagination? If we want to take an image from the media exactly as it is, we can depend on TV sets; but if we want to live the image in our heads, with our own interpretation, it is better to watch the film format.

John Mercer
January 12th, 2006, 03:17 AM
"The theater owners association agreed on 48p (the standard is something like 50 pages long), in addition to 24p, as a future film speed for digital cinema; this was done together with the studios. Top filmmakers always wanted more than 24 fps. The video wanna be filmmakers always wanted 24p. There was 30 fps 70mm, and 48 fps IMAX costs a lot more than 24 fps IMAX."

Petr, I don't know where you get your info from but if you're going to make such definitive statements then sources please.

Since when did theatre owners decide on film standards? Which studios propose 48p? You admit 48p is proposed in ADDITION to 24p. Which top filmmakers have gone on record as wanting more than 24p? Are you sure they don't mean digital variable rates for slow-motion and the like rather than a projection standard? I'm not saying that 24p is the holy grail myself, you understand, but it is perceived to be by both industry professionals and 'video wanna be filmmakers' - that is why Cine-Alta and Varicam use 24p.

There have been a variety of frame rates used in cinema - 24fps has been THE standard since the introduction of the talkies in 1929. Todd-AO used 30fps when introduced but quickly dropped it. IMAX is a specialist format - a 'fun-ride' format if you will. Subjects are often short, such as roller-coaster rides, and the overall desire is to produce as 'life-like' an experience as possible (this was especially the case before the advent of HD); therefore, so the theory goes, 48fps looks more 'lifelike' than 24fps - for the very same reasons it does in video.

Petr Marusek
January 12th, 2006, 03:25 AM
"The theater owners association agreed on 48p (the standard is something like 50 pages long), in addition to 24p, as a future film speed for digital cinema" Petr, I don't know where you get your info from but if you're going to make such definitive statements then sources please. Since when did theatre owners decide on film standards?

As everyone wants to go digital to save $, and the studios are financing this digital theater transition, there is a new standard that calls for 4K/24 fps and 2K/24&48 fps. As I said, the standard is some 50 pages long. You'll just have to Google the net.

As the new digital-projector theaters will have 48 fps capability, you'll see a move towards 48 fps production by Cameron, Lucas, and other top filmmakers. There will still be 24 fps prints made for the non-converted theaters, though. 48 fps also converts fine to 50/60 i/p for HDTV, Blu-ray, etc.
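The claim that 48 fps "converts fine" to 60 can be sketched as a simple 4:5 repeat pattern (every fourth frame shown twice), analogous to pulldown. This is an illustration added for reference, not a description of any particular conversion hardware:

```python
# Sketch of a 4:5 repeat pattern for 48 fps -> 60 fps conversion:
# every group of 4 source frames yields 5 output frames.

def repeat_48_to_60(frames):
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i % 4 == 3:        # repeat every 4th frame
            out.append(frame)
    return out

one_second = list(range(48))             # 48 source frames
print(len(repeat_48_to_60(one_second)))  # -> 60
```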

John Mercer
January 12th, 2006, 03:31 AM
"As everyone wants to go digital to save $, and the studios are financing this digital theater transition, there is a new standard that calls for 4K/24 fps and 2K/24&48 fps. As I said, the standard is some 50 pages long. You'll just have to Google the net."

No - the burden of proof rests with you when you assert something; that's how it works. I'm not going to google it - you'll have to provide a link not only to the standard but to the top filmmakers' statements on wanting something other than 24p.

Matthew Greene
January 12th, 2006, 03:46 AM
Well, in all fairness if there were applications for 48fps film projection, there are applications for 48fps digital projection.

It's just my best guess that your everyday cinematographers aren't going to stop shooting the bulk of projects at 24fps just because they have a higher frame rate available. If we do have a transition to shooting and projecting 48fps as the future standard, it's going to be a slow, slow one. The only reason that's my bet is that 99% of mainstream motion pictures are still shot and exhibited on film almost 10 years after the digital cinema hype began, and that was a much more anticipated technology.

Kevin Shaw
January 15th, 2006, 09:18 AM
The only reason that's my bet is that 99% of mainstream motion pictures are still shot and exhibited on film almost 10 years after the digital cinema hype began, and that was a much more anticipated technology.

Perhaps so, but don't be too sure we aren't on the verge of a digital cinema transition. It took less than ten years from the introduction of the first affordable digital still cameras to this week's announcement that Nikon is halting production of most of their 35mm film cameras; if digital theater is taking longer it's only because it's a more technically demanding task. At some point the quality and cost-effectiveness of digital video will wipe out movie film just as surely as digital photography has largely wiped out film photography. It's just a matter of time now.

Dan Euritt
January 15th, 2006, 10:24 AM
It took less than ten years from the introduction of the first affordable digital still cameras to this week's announcement that Nikon is halting production of most of their 35mm film cameras; if digital theater is taking longer it's only because it's a more technically demanding task.

There is no logical basis for comparing digital still cameras with digital theater.

Digital theater infrastructure is growing very slowly because 1) there isn't any media for it, and 2) it costs a whole lot of money to install a digital theatre.

Neither of which is the slightest bit applicable to digital cameras.

Mathieu Ghekiere
January 15th, 2006, 10:52 AM
I love 24p, I love that kind of... blurriness (don't know if that's the exact word)...
I don't want movies to start looking like soap operas (60p).

Mark Grant
January 15th, 2006, 12:20 PM
I remember seeing a Showscan demo when I was a kid, if I remember correctly it was large format film at 60fps

70mm at 60fps, I believe: I've heard good things about it, but never seen it.

As for 3D, the problem is that you need a huge screen to get a decent 3D effect: IMAX can do it very well (in fact I've found some of the IMAX 3D footage I've seen quite disconcerting because it seemed so real), but a normal cinema screen or TV set can't. The screen has to fill most of your field of view to get a decent effect, because no '3D' object can extend beyond the edges of the screen.

Ken Hodson
January 15th, 2006, 03:13 PM
"I love 24p, I love that kind of... blurriness (don't know if that's the exact word)...
I don't want movies to start looking like soap operas (60p)."

They will only look "soap opera-ish" if you want them to. 24p shot with a high shutter speed has no blur, only crisp, clear stutter/judder. Higher frame rates can use any shutter speed they want to match the blur level needed. That is the benefit of this new technology: you can create any look you like. The look of 24fps film definitely has its place with paced, non-action dramas, and a higher frame rate can emulate it - just minus the judder on the pans, unless you want to add that in post ;>)
Filmmaking has changed dramatically since the 24fps standard was institutionalized an incredibly long time ago. Static shots were the rule. Panning is an art form where one has to fight the limits of 24fps.
Modern high-action productions suffer from a limiting frame rate. The modern eye craves space battles that are not a juddery, blurry mess. Let's save that for the love scenes.
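The shutter-speed-versus-blur trade-off described above comes down to simple arithmetic: exposure time per frame is the shutter angle divided by 360°, divided by the frame rate. A quick sketch (added for illustration; the values are the standard cinema conventions):

```python
# Exposure time per frame for a given frame rate and shutter angle:
# exposure = (angle / 360) / fps. Shorter exposure = less blur, more judder.

def exposure_time(fps, shutter_angle=180.0):
    return (shutter_angle / 360.0) / fps

print(exposure_time(24))       # 1/48 s  - the classic film look
print(exposure_time(24, 90))   # 1/96 s  - crisper, more judder on pans
print(exposure_time(60, 180))  # 1/120 s - the 60p equivalent of a 180° shutter
```

So a higher-frame-rate camera can open or close its shutter to dial blur up or down per shot, which is the flexibility being claimed here.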

Kurth Bousman
January 15th, 2006, 03:38 PM
Hey guys - I believe film is projected at 48 fps: shot at 24 fps, with each frame projected twice and a shutter pass between. Our brains fill in the rest with "persistence of vision". Kurth

Ken Hodson
January 15th, 2006, 03:41 PM
John- "I'm not saying that 24p is the holy grail myself, you understand, but it is perceived to be by both industry professionals and 'video wanna be filmmakers' - that is why Cine-Alta and Varicam use 24p."

The Cine-Alta and Varicam video cameras use 24p so they could shoot for film. 24fps has been the institutionalized projector format for 80 years. There was no choice in not using 24p if they were going to sell them as film-capable cams. It isn't a matter that industry professionals perceived 24p as the holy grail; there never was a choice. You shoot for film or you don't.

"There have been a variety of frame rates used in cinema - 24fps has been THE standard since the introduction of the talkies in 1929. Todd-AO used 30fps when introduced but quickly dropped it."

That has nothing to do with preferred frame rate and everything to do with economics. 24fps projectors became the first true mass media standard. Higher frame rate projectors would require an industry to change all of their projectors, many of the cameras, and increase the cost in film stock, and a re-education in shooting theory. Same reason we have all had the same level of TV broadcast all of these years. You don't just up and change an industry standard.
But guess what is happening to TV? The same thing is beginning with theater projection as well.

John Mercer
January 16th, 2006, 02:26 AM
"That has nothing to do with preferred frame rate and everything to do with economics. 24fps projectors became the first true mass media standard. Higher frame rate projectors would require an industry to change all of their projectors, many of the cameras, and increase the cost in film stock, and a re-education in shooting theory."

This doesn't make sense, since there was no real standard before 1929 and the introduction of sound. There was a nominal 16fps, but it was mostly hand-cranked. So why not choose 16fps over 24fps according to your theory of economy? 24fps was considered the minimum consistent both with regard to flicker and the audio fidelity of the optical sound linear speed - the 're-education in shooting theory' and the requirement of "...an industry to change all of their projectors, many of the cameras..." would have been no different had they chosen 16fps or 30fps constant-velocity motorised equipment. Film stock costs for Hollywood have always been of negligible concern compared to budget.

"The Cine-Alta and Varicam video cameras use 24p so they could shoot for film. 24fps has been the institutionalized projector format for 80 years. There was no choice in not using 24p if they were going to sell them as film-capable cams. It isn't a matter that industry professionals perceived 24p as the holy grail; there never was a choice. You shoot for film or you don't."


Whilst you are correct that both Cine Alta and Varicam had to fit in with a 35mm film-out workflow, it is clear that their whole designs were conceived to excite existing filmmakers to the possibilities of video as a viable alternative to film; therefore 24p is offered as a creative option, not just for technical film-out. Whilst 24fps was never chosen in film because of some magical temporal effect, it HAS come to be perceived so today, and it is the Holy Grail and the key for video to look like film for most industry professionals and prosumers alike.

Ken Hodson
January 16th, 2006, 07:16 PM
"So why not choose 16fps over 24fps according to your theory of economy?"

You're missing my point. I was referring to your statement about why a new (30fps or other) frame rate did not displace 24fps as an industry standard. The fact stands that new frame rates will not be accepted, no matter how technically superior and artistically expanding, because every theater in the world uses one standard. With the introduction of digital projectors, this shift will finally happen. The archaic standard of 24fps will no longer be the only option.

Kevin Shaw
January 17th, 2006, 03:37 PM
Film stock costs for Hollywood have always been of negligible concern compared to budget.

I wouldn't be too sure about that, considering that going from 24 to 30 fps would involve a 25% increase in film costs at every stage of production, including the cost of the final print to theaters. As I understand it, 24 fps was chosen because it was the lowest frame rate which worked to get a motion look acceptable to audiences while minimizing film costs. I suppose that same argument could be applied to digital recording and distribution, but digital storage is cheaper than film stock and can be reused anywhere a permanent archive isn't needed.
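The 25% figure above is straightforward: film stock consumed scales linearly with frame rate, so the cost increase at every film-touching stage is just the ratio of the rates. A trivial sketch (added for illustration):

```python
# Film consumption scales linearly with frame rate, so the cost increase
# at every stage that touches film is (new_fps - old_fps) / old_fps.

def cost_increase(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps

print(f"{cost_increase(24, 30):.0%}")  # -> 25%
print(f"{cost_increase(24, 48):.0%}")  # -> 100% (a doubling, as with 48 fps IMAX)
```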

Today we're seeing that progressive-scan video even at 30 fps can look "stuttery" for moving objects under some conditions, and there's a whole theory around why that's the case. All things considered, I'd rather see us move toward higher frame rates for digital video and forget about all this 24 fps nonsense. Give me 720p at 60 fps and that's something worth talking about; 1080p at 60 would be even better.

Kevin Shaw
January 17th, 2006, 04:59 PM
There is no logical basis for comparing digital still cameras with digital theater.

Digital theater infrastructure is growing very slowly because 1) there isn't any media for it, and 2) it costs a whole lot of money to install a digital theatre.

Neither of which is the slightest bit applicable to digital cameras.

Geez, we just can't seem to agree on much of anything, can we? :-)

The relevance of the comparison to digital still cameras is to show how quickly new technology can take over once it catches on and gets to a point of offering a clear practical advantage. As recently as a couple of years ago many photographers were saying they'd never go digital; today most of them are changing their tune. What happened is that it became obvious digital photography is more cost-effective, convenient, and arguably higher quality than film photography, and that it was necessary to switch in order to stay competitive.

If digital theater hasn't spread that simply shows we haven't reached a sufficiently compelling state of affairs for that application yet. But consider that starting this year we'll have a standard format for distributing HD movies using relatively affordable discs and players, and hence you *could* set up a small digital theater using a few thousand dollars worth of playback and projection equipment. Some of the theater screens at my local cineplex aren't any bigger than could be comfortably filled by a good high-end projector, and if they were any smaller I could probably use my $1000 XVGA projector to show something on them.

I have no idea what a standard movie theater film print costs these days, but I'm guessing it isn't cheap. A few of those should easily pay for the cost of a small-screen digital conversion, and a few more ought to pay for a larger-screen digital setup. Didn't someone announce last year that they're going to install several thousand digital cinemas in either the U.S. or Europe? I wonder how that project is going. Either way, the point here is that once digital video technology gets to some level of practical effectiveness, the transition from film to digital projection should happen fairly quickly. I'm not saying when that might occur, only that it's probably inevitable.

Ron Evans
January 17th, 2006, 06:33 PM
As John has said, film at 24fps is mainly based on the economics of presenting an acceptable image to the audience at an acceptable cost for film distribution, storage, and the cost and size of projectors, etc. Gentlemen, we are stuck in the economics and technology of the last century. The limitations of the technology governed how films were made: restrictions on motion (whether the camera or the subjects move) created the style of close-ups, fixed shots, angled motion, and high-speed cameras used for crash sequences etc. for slow motion in the final print. While the final output for distribution is film, these restrictions will still apply. Unfortunately we penalize everyone in the home market with this inadequate technology, when the equipment they use could produce nice sharp clean images, since it has none of these technological or economic restrictions. I expect that when the industry has moved to electronic distribution (again for economic reasons), the film limitations of the last century will no longer penalize everyone. We will all move to progressive HD, likely at greater than 60fps. Artistic effects will still be available to create any storyline sequence needed (even emulating the old 24p look!!!!)


Ron Evans

Matt Vanecek
January 17th, 2006, 10:59 PM
digital theater infrastructure is growing very slowly because 1)there isn't any media for it 2)it costs a whole lot of money to install a digital theatre.

Just FYI, digital theatre will be prevalent within the next two years, and film theatre less so - at least as far as Hollywood films are concerned. Carmike is going to convert all their theatres to digital by Q4 2007 (http://www.dcinematoday.com/dc/pr.aspx?newsID=369), although I've seen reports it would be completed in 2006. Sony, DreamWorks, et al. are similarly ramping up. Film cost and technology cost have finally reached a point where it is financially extremely sensible (even urgent - I think Hollywood is WAY behind on this) to replace new high-cost film equipment (or older equipment) with new digital systems with varying types of delivery (e.g., Carmike has plans for satellite delivery of content - imagine, Star Wars coming to you via inner space!!).

Money has not really been an issue, per se. From a business side, spending 100K or a million to retrofit a theatre makes complete sense, if you have a decent marketing guru. Not only do you negate the expense of film handling (and the associated personnel), but you also open up huge avenues of additional revenue by bringing in special events, etc., that will make extensive use of the digital infrastructure. From birthdays to corporate outings, etc., the digital infrastructure presents significant revenue generating opportunities outside the normal Hollywood release schedule. Mark Cuban's theatrical ventures (e.g., simultaneous screen, DVD, and PPV releases) yield even more intriguing marketing possibilities. Just, everyone's been afraid of the "risks" involved (risk mitigation is HUUUGGEEE in any corporation of size) with going digital up to recently.

Media has never been an issue. Hollywood has been digitizing film for decades, and nowadays routinely moves movies from film to computer video and back to film. The movie "O Brother, Where Art Thou?", shot on film, was almost entirely processed on a computer and burned out to film again (according to the DVD "making of" featurette). And then, of course, it was digitized again to produce the DVD and VHS (were there any tapes?).

Personally, I'm still not sure I like digital cameras over film (I CERTAINLY hate the shutter delays in the consumer models!). I still think film just responds better to light, if you have the correct film for the shoot (or even if not, in many cases)--speaking for stills.

I really really like the HD signals when I'm watching a golf tournament (at a friend's house - ain't shelled out for one myself yet) or a football game - so the more res, the better. I haven't had the opportunity to watch feature films on such a set yet, though. In any case, 1080i is a marketing point for many sets. I guess the question becomes: are the 1080i sets also able to process lower resolutions such as 720p nicely? (I think they are, but I've not studied it - just watched SD shows on the occasional HD TV, and they didn't look *too* horrible.) Personally, I'd rather have the highest resolution available and downconvert as needed. Eventually, the way things move, my highest available will become middle of the road, and then obsolete. I mean, what good's an HD camera gonna be when we're all watching holograms?

hmm, think I eventually wandered back sorta to the original topic....

I'll miss the days of looking for Mo Henry in the credits...

ciao,
Matt

Matthew Greene
January 18th, 2006, 12:25 AM
Digital projection is a hot topic; however, the roadblock is that the majority of theatre owners don't want to make the investment and the studios/distributors don't want to pay for it. Nowadays, movie theatres don't pay for the prints, and with razor-thin profits many of them would go bankrupt immediately if they had to invest in digital projection systems for all their theatres at once. In the meanwhile, the handful of theatres that do have digital projection systems installed aren't getting proper support from distributors, films aren't available, and there are too many formats involved. Studios also have concerns about the cost/trouble of digitizing every film they release at 2K resolution, and about potential piracy from their 2K releases. They're going to have to figure all these issues out.

While practically all Hollywood features are shot on 35mm, the trend of scanning the film and doing all color correction and finishing digitally (a Digital Intermediate) is growing; O Brother, Where Art Thou? was indeed one of the first to take this approach. It's a good approach, and few DPs can argue with its benefits (although some still prefer the optical route).

Hollywood has the final word on which filmmaking systems make it and which don't. Hollywood is all in support of DIs and split on digital delivery/projection. DPs aren't moving towards digital acquisition, though; the adopters of this technology, other than George Lucas and Robert Rodriguez, have for the most part been independent filmmakers. Hollywood DPs aren't likely to give up film anytime soon.

Matt, you said "Hollywood has been digitizing film for decades". While that may be a true statement depending on interpretation, digitizing complete features at near-film resolution is a pretty recent innovation - and it's still very impractical to work with full films at 4K resolution.

Regarding the frame rate issue, I've never heard a film DP say "I wish we weren't shooting at 24fps." In fact, if you look at primetime series shot in HD, they're all 24p; sure, they could have shot at 60i, but in the context of a dramatic piece 60i would ruin a certain aesthetic offered by 24fps capture and it would all of a sudden start looking like a soap opera. For the most part, directors of photography love the 24fps medium and wouldn't trade it for 60i if you put a gun to their heads.

Someone mentioned that to achieve the motion blur of 24p in 60fps capture all you have to do is modify the shutter speed... that's not possible. At 60i the lowest shutter setting you can use while still having 60 independent images is 1/60. Sure, a camera might have a 1/8 or 1/15 setting, but you'll get a strobe effect in your image if you're recording at 60i. The only way to achieve a true 1/48 sec shutter is by shooting at 24fps or slower.

Edit> I had an afterthought/comment. I'm prepared to be pretty disappointed when digital projection becomes common. Think about it: anyone that's worked with projection systems in cinematic presentation for a period of time knows that they have to be tweaked constantly as the lamp/device ages if you want to maintain quality. Right now, theatres that have digital projection have consultants working along with them, but can you imagine what it's going to be like when it becomes a widespread technology? You're going to have $7/hr teenagers who don't know anything about engineering tweaking and adjusting these things - I mean, for god's sake, they manage to mess up the focus and framing of film projectors more often than not... Unless technology changes dramatically or theatres start paying engineering salaries, you're going to have kids tweaking 100+ delicate settings on these devices. Someone is going to have to come up with a good way to set and maintain these devices, or be ready for images that'll make the DP of that film cry.

Ken Hodson
January 18th, 2006, 03:10 AM
"Regarding the frame rate issue, I've never heard a film DP say "I wish we weren't shooting at 24fps." In fact, if you look at primetime series shot in HD, they're all 24p; sure, they could have shot at 60i, but in the context of a dramatic piece 60i would ruin a certain aesthetic offered by 24fps capture and it would all of a sudden start looking like a soap opera. For the most part, directors of photography love the 24fps medium and wouldn't trade it for 60i if you put a gun to their heads."

Alright, let's lose the 60i talk. No one mentioned interlaced. We are talking digital projection at 48fps or 60fps, or 48p/60p capture.

"Someone mentioned that to achieve the motion blur of 24p in 60fps capture all you had to do is modify the shutter speed... that's not possible... The only way to achieve true 1/48 sec shutter is by shooting at 24fps or slower."

My bad. At 60p capture I believe a shutter of 1/120th would give the correct 180-degree shutter angle to match. Correct? But considering the massive shift to elaborate digital post, motion blur can be done completely in post, to exactly the levels required. For that matter so can depth of field, though it is far more involved than adding motion blur. We are definitely entering a new era of filmmaking.

"I had an afterthought/comment. I'm prepared to be pretty disappointed when digital projection becomes common... Someone is going to have to come up with a good way to set and maintain these devices, or be ready for images that'll make the DP of that film cry."

Good point, but I think you overestimate how well this calibration has been kept up. George Lucas had testing done of theaters before the release of The Phantom Menace, and found that a large number of theaters, including the major chains, were dramatically underlighting projection to save money. He made a public statement that any theater found underlighting his film would have it pulled. I never heard any more of it after that. The digital projectors will probably have a flashing "replace bulb" light when a change is needed. If the $7/hr worker can read, it shouldn't be a problem, unless his supervisor tells him to put a piece of tape over the flashing light and ignore it ;>)

Matthew Greene
January 18th, 2006, 03:38 AM
Alright, let's lose the 60i talk. No one mentioned interlaced. We are talking about digital projection at 48fps or 60fps, and 48p/60p capture.

Ok, point taken: replace the i with a p. It's still the same in terms of motion, although not an actual production standard at the moment.

...At 60p capture I believe a shutter of 1/120th would give the correct 180 degree shutter angle to match. Correct?

Yeah, it will give you the equivalent of a film camera with a 180-degree shutter running at 60fps, but the only way to get the filmic motion blur of 24fps is to have the shutter open for 1/48 of a second. A 1/120 shutter is equivalent to roughly a 72-degree shutter in a film camera running at 24fps.
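To put numbers on that equivalence, here's a tiny illustrative Python sketch (my own, not from the thread), using the standard relation angle = 360 × fps × exposure time. It confirms that a 1/120s shutter is a true 180-degree shutter at 60fps, but corresponds to only about a 72-degree shutter at 24fps.

```python
# Shutter-angle equivalence (illustrative): angle = 360 * fps * exposure_time.

def angle_from_exposure(fps, exposure_s):
    """Equivalent rotary shutter angle (degrees) for a given exposure time."""
    return 360.0 * fps * exposure_s

# 1/120s at 60fps is a true 180-degree shutter:
assert abs(angle_from_exposure(60, 1 / 120) - 180.0) < 1e-9

# The same 1/120s exposure at 24fps corresponds to only a 72-degree shutter,
# far "crisper" looking than the filmic 180 degrees (1/48s):
assert abs(angle_from_exposure(24, 1 / 120) - 72.0) < 1e-9
```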

...But considering the massive shift to elaborate digital post, motion blur can be done completely in post, to exactly the levels required. For that matter so can depth of field, though it is far more involved than adding motion blur. We are definitely entering a new era of filmmaking.

You can fake motion blur, but it arguably looks fake, and it takes render-intensive tracking of every pixel from frame to frame. Nothing replaces actually capturing the blurred movement that a higher shutter speed leaves out. Realistically, it's definitely not something you'd want to try to do for a whole film.

Good point, but I think you overestimate how well this calibration has been kept up... The digital projectors will probably have a flashing "replace bulb" light when a change is needed. If the $7/hr worker can read, it shouldn't be a problem, unless his supervisor tells him to put a piece of tape over the flashing light and ignore it ;>)

Don't forget that for SW, every theatre had consultants who were babysitting the devices on an almost daily basis, and George was applying pressure because his reputation was on the line. Take that handful of theatres and now consider every theatre in the country, or the world, having a DP system. Most projectors let you know when lamp life is past a certain point, but even if you tweak the image at hour 1, by hour 100 you have a different-looking image. It's not a matter of replacing lamps; it's a matter of maintaining quality, which will never be achieved unless some sort of auto-setup system for projectors is developed.

Ken Hodson
January 18th, 2006, 04:29 AM
"You can fake motion blur, but it arguably looks fake, and it takes render-intensive tracking of every pixel from frame to frame. Nothing replaces actually capturing the blurred movement that a higher shutter speed leaves out. Realistically, it's definitely not something you'd want to try to do for a whole film."

I think you vastly underestimate the power, quality, and speed of modern post tools. In heavily composited shots, this is the norm now. And of course it all depends what level of production we are talking about. On an indie HD feature for release on DVD you would be very hard pressed to tell, whereas a filmout might be more telltale. And an uber-budget Hollywood FX picture with the best software?

"Don't forget that for SW, every theatre had consultants that were babysitting the devices on an almost daily basis."

The theaters were found to be using underpowered bulbs on purpose to save money. This was, and is, an industry-wide practice, and it only stopped briefly under threat from Lucas for his picture. I only brought it up because the new digital projectors are not going to have any more negative impact than the current system, which is corrupt and far from perfect, regardless of how much the projectionist gets paid.
Given that the shift to DP is going to happen no matter what, I believe they will incorporate some form of auto-calibration.

Matthew Greene
January 18th, 2006, 04:53 AM
I think you vastly underestimate the power, quality, and speed of modern post tools. In heavily composited shots, this is the norm now...

I know what the confusion is here. In composited shots or CGI work, the motion blur is easy to calculate and generate, since you're creating that movement in the software. The problem arises when you already have something shot with a high-speed shutter and bring it into the computer: the computer needs to look around, motion-track, analyze, and calculate the movement of everything in the image from frame to frame, and if the difference is too big and the computer gets lost, it needs human assistance. It's a painstaking and slow process, and no substitute for getting it right in camera in the first place.

One afterthought I had: if you tried to apply the blur of a 1/48 sequence to a 1/60 sequence, the blur would anticipate the movement on some frames.

...I only brought it up because the new digital projectors are not going to have any more negative impact than the current system, which is corrupt and far from perfect, regardless of how much the projectionist gets paid. Given that the shift to DP is going to happen no matter what, I believe they will incorporate some form of auto-calibration.

I think most people would agree on this; I just see the switch to DP as a potential deterioration of the theatrical experience unless quality standards are strictly maintained. As good as DP is, there's also huge potential for ruining the image with improper image settings, which are not "set and forget": they drift pretty quickly. There are many more variables affecting picture quality in the setup of a DP system, with a hundred image settings, than on a 35mm projector with just a couple of settings.

If an auto-setup system is going to be created, and none that I know of exists, it's going to have to be some sort of intelligent image-analysis camera/computer system that can measure all aspects of image calibration and control the projector during setup. I'm not exactly sure how they'd pull it off, but it's our best bet for seeing a film projected at the settings the DP intended. It's that or kids running the lamps in economy mode, and if that's going to be the norm, I'll run to the nearest theatre with a film print.

David Heath
January 18th, 2006, 05:30 AM
As John has said, film at 24fps is mainly based on the economics of presenting an acceptable image to the audience within an acceptable cost for film distribution, storage, cost and size of projectors, etc. Gentlemen, we are stuck in the economics and technology of the last century. The limitations of the technology governed how films are made.
I fully agree. I don't believe there is anything 'magical' about 24fps either; rather, it's the result of historical compromises. Before the 'talkies' arrived the standard was 16fps, projected with a three-bladed shutter, and I believe this was not because it was considered optimum, but because it was the minimum (for cost reasons) that could be got away with before judder became totally objectionable. The move to 24fps came about mostly because of sound, and the need for the film to move faster linearly for acceptable reproduction. Since sound meant new equipment anyway, a new fps standard didn't cause too many problems.

Not many years ago I would have cried at the thought of digital projection replacing film in the cinema. A visit to IBC 15 months ago with the d-cinema demonstration, followed by a trip to my local cinema changed that. It left me wondering why film-makers weren't clamouring for the 'HDTV look', rather than the other way round! For some information about what's happening in the UK, there's this report - http://news.bbc.co.uk/1/hi/technology/4297865.stm - albeit now nearly a year old.

My understanding of the European plans for HD is that broadcasting systems will allow for either 720p or 1080i transmission, but that 1080 will be far more common. (Much drama is currently made 1080p/25, to be transmitted as 1080i/50.) The ultimate goal is seen as 1080p/50, for production at least, though transmission in that standard is likely to take a lot longer owing to bandwidth limitations. In the meantime 1080i is seen as the most likely way forward: sport etc. produced as such, drama produced 25p and transmitted as segmented frames. The expectation is that manufacturers will move toward 1080 native resolution for screens.

Don Donatello
January 18th, 2006, 01:40 PM
for hollywood, actual film costs are very low compared to the budget, so an increase of 35% for film is very little in a 50mil+ budget ... if it sells more tickets they will do it ...

hollywood wants digital projection .. and i believe they see it as lowering piracy. right now prints are taken out the back door at night and transferred to tape ... with digital delivery they could track copies very easily ...

Matthew Greene
January 18th, 2006, 01:47 PM
Camera negative is a small part of the budget, but the part of the chain where film is the biggest cost is distribution. Imagine the cost of producing and delivering 2,500-3,000 feature-length prints for a large nationwide release; it usually runs into the tens of millions.

Kristian Indrehus
January 18th, 2006, 03:53 PM
Camera negative is a small part of the budget, but the part of the chain where film is the biggest cost is distribution. Imagine the cost of producing and delivering 2,500-3,000 feature-length prints for a large nationwide release; it usually runs into the tens of millions.
That's one reason why cinemas want to go digital. There are never enough copies, because they cost too much, so each cinema can only keep its copy for a set period of time. The prints are circulated among the different cinemas; the small ones get them later, once the copies are worn out. In the digital world, all big releases open the same day at every cinema all over the world, and are taken off when the crowds stop showing up.

Add: And did you know they make copies of copies? In countries like mine, where they add subtitles, you might end up looking at a scratchy copy of a copy, with sharp subtitles and an unfocused picture.

Go digital !