DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   Open DV Discussion (https://www.dvinfo.net/forum/open-dv-discussion/)
-   -   HDR Video (32 bit) Discussion... (https://www.dvinfo.net/forum/open-dv-discussion/106203-hdr-video-32-bit-discussion.html)

Jay Burlage October 22nd, 2007 10:53 AM

HDR Video (32 bit) Discussion...
 
Greetings all! I'm new here and have a topic I'd like to discuss and I don't really know where it belongs... so here we go!

I've been forging what I believe is new ground by using time-lapse techniques to create true 32-bit real-world video sequences (hdrtimelapse.com). My sequences are being used to develop 32-bit codecs. As you may or may not know, the first HDR display to hit the market (BrightSide) was snatched up by Dolby along with their in-development HDR-MPEG codec. Sony recently announced their entrance into the uber contrast-ratio club (linky) as well, and we will see more and more coming out of the woodwork in the next few years...

So what are your thoughts on super latitude capture, encoding, editing, and display?

-I personally think this will be the death of film in capture, as the technology will ultimately surpass the latitude king.

-What have you heard about cameras? I know there are a couple out there getting up to 14-bit, but has anyone seen anything on the horizon?

-Editing? I know we've got After Effects and Fusion handling Radiance frames (and maybe Vegas now?)... What else is there, and where do you see this whole 32-bit thing going?

I've seen some very lively discussions (way over my head as well) here on bit depth; thus my interest in joining in and seeing what the community has to say. I'm finally 'in' and ready to talk! Feel free to ask me questions as well if you're scratching your head...

Jay

Emre Safak October 22nd, 2007 10:59 AM

Currently the only place for HDRI in film-making is matte painting (sometimes in panoramic form), since they are static. As you know, HDRI means intelligently integrating several sources for increased dynamic range. This is very difficult to do unless the sources overlap. I'm sure one day we will be able to take several cameras and shoot the same scene from several angles and combine them for a 3D HDRI. This is computationally infeasible for now. I would be happy to have affordable 4:4:4 2K, let alone 4K. We can worry about HDRI in ten years.

Jay Burlage October 22nd, 2007 11:23 AM

Quote:

Originally Posted by Emre Safak (Post 762753)
I would be happy to have affordable 4:4:4 2K, let alone 4K. We can worry about HDRI in ten years.

I totally agree, but can you imagine a capture and display system that could accurately replicate, say, 15 EVs? Sometimes I wonder if it would be in a sense 'too much'... I have no idea what my work looks like on one of these emerging displays. It's hard to visualize what that experience would be like after a lifetime of 'limited latitude'.

Glenn Chan October 22nd, 2007 02:35 PM

On the viewing side, I think you're mostly limited by display technologies and viewing environment.

Theatrical projection:
AFAIK, digital projection is inferior in contrast ratio compared to film. (Though this may change if projectors get better.)

Viewing environment-wise, theatres need to light the ground so that people can see where they are walking if they need to go to the washroom or whatever. And the exit/fire signs *have* to be lit. So you will be somewhat limited by ambient light hitting the screen. If projectors were really bright, you could use a lower-gain screen and improve that... but right now projectors aren't really bright (and theatre owners may run them dimmer to extend lamp life).

Home:
Would require people with home theatres + little ambient light hitting the screen. That doesn't look like a mass market kind of thing.

On the other hand, I'd expect display technology to get higher and higher (effective) contrast ratios since it definitely improves image quality. Images don't look right without deep blacks, not that milky stuff a lot of current displays do. Right now direct-view LCD is not that bad with a bright surround (subjectively speaking).

2- On the image acquisition side, the manufacturers are definitely trying to make cameras with higher dynamic range... though if you go for high dynamic range, you have to make tradeoffs elsewhere (e.g. resolution, or artifacts). I think video sensors with 20 stops of dynamic range are possible, though the pictures aren't necessarily that good.

For normal dynamic-range output, it will be interesting to see how much dynamic range is enough. At what point do cinematographers go... ok, we have enough dynamic range. Plus, there is the ability to light a scene to fit within the camera's dynamic range (though you are limited in what you can do in day exteriors).
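A quick note on the arithmetic behind those figures: stops are a base-2 log scale, so each additional stop of latitude doubles the representable contrast. A minimal Python sketch (the 20-stop sensor is the hypothetical mentioned above, and the 16384:1 figure is just an illustrative 14-bit example):

```python
import math

def stops_to_contrast(stops):
    """Each stop doubles the light, so n stops spans a 2**n : 1 contrast ratio."""
    return 2 ** stops

def contrast_to_stops(ratio):
    """Inverse: how many stops a given contrast ratio represents."""
    return math.log2(ratio)

print(stops_to_contrast(20))            # 1048576 -- about a million to one
print(round(contrast_to_stops(16384)))  # 14
```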

Glenn Chan October 22nd, 2007 02:55 PM

As far as display technology goes, there are certain things where current displays aren't ideal:

- Contrast ratio / dynamic range / blacks don't look black. Part of this is how bright the surround is.
- Low dynamic range. After you get blacks looking (pretty) black, you probably want your specular highlights to be brighter. If we get that, then perhaps metal objects will start looking truly metallic on a display, because we're getting the specular highlights right.
- Resolution; deeper blacks increase perceived sharpness, so other things affect perceived resolution. More resolution / a higher sampling rate would also let us get rid of aliasing and ringing artifacts.
- Viewing angle (related to resolution); how much of our field of view is covered by the image
- 3-D (not widespread yet)
- Color gamut
- Image artifacts, depending on display technology. e.g. ghosting, DLP rainbow artifact, etc. etc.
- (home) compression artifacts

And perhaps a number of other factors. It will be interesting to see which areas get the most improvement / which approach is taken. Right now with DCI + digital projection, it looks like digital projection will eventually achieve higher resolution and a larger color gamut (though Sony 4K projection doesn't look as good as 2K, where 2K has better blacks).

NHK is going with ultra high definition.

Various companies are pushing the 3-D route.

Imax has resolution and viewing angle.

The home theatre world is moving to lay the groundwork for wide gamut (with xvColor). Though it has to be seen how that pans out.

2- In my opinion... I think displays will move towards a higher contrast ratio, but not so dramatic that we would be displaying HDR. Ambient lighting would limit the dynamic range except in special viewing environments... some sort of HDR theatre. But for that to happen, the image quality has to be noticeably better than what already exists. And I think people would prefer 3-D over HDR.

Technology-wise, it looks like 3-D is more feasible than HDR.
The cost and technology to make 3-D movies is going to get better. Stereo rigs are likely going to become cheaper; 2D --> 3D conversion is going to get better and cheaper (as software and computers get better); blockbusters have a lot of CG imagery now, which lends itself to 3-D.

Giroud Francois October 22nd, 2007 04:03 PM

HDR is perfectly viewable on a regular display. Most of the HDR pictures you can find on the web are displayed on your computer screen, and there is no need for special hardware to understand the benefit of HDR capture.
Likewise, an HDR movie can be made using the same equipment used for 3-D capture (basically a semi-transparent mirror and two cameras).
This way you can set one camera for low light and the other for bright light.
Then you combine the pictures the same way you would for HDR photography.
A codec using 32 bits could be used (the 8 additional bits used not for alpha but for an additional luminosity channel).
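For anyone curious what that two-camera merge looks like in practice, here is a minimal Python/NumPy sketch. It assumes two aligned 8-bit frames and a known exposure ratio between the cameras; the function name and the 4-stop gain are illustrative, not from any particular product:

```python
import numpy as np

def merge_exposures(dark, bright, bright_gain=16.0):
    """Merge two aligned 8-bit frames of the same scene into one 32-bit
    float radiance map. `dark` is exposed for the highlights, `bright`
    for the shadows; `bright_gain` is the exposure ratio between them
    (16x here, i.e. an assumed 4-stop difference)."""
    dark_f = dark.astype(np.float32) / 255.0
    bright_f = bright.astype(np.float32) / 255.0
    # Scale the long exposure back onto the short exposure's scale.
    bright_lin = bright_f / bright_gain
    # Hat-shaped weights: trust mid-tones, distrust pixels near clipping.
    w_dark = 1.0 - np.abs(dark_f - 0.5) * 2.0
    w_bright = 1.0 - np.abs(bright_f - 0.5) * 2.0
    total = w_dark + w_bright
    total[total == 0] = 1e-6  # avoid division by zero where both clip
    return (w_dark * dark_f + w_bright * bright_lin) / total
```

The weighting simply favors well-exposed pixels and ignores clipped ones, which is the same idea used when merging bracketed stills into an HDR photograph.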

Jay Burlage October 22nd, 2007 04:16 PM

Quote:

Originally Posted by Glenn Chan (Post 762869)
On the viewing side, I think you're mostly limited by display technologies and viewing environment

That is a very good point... To really appreciate some mega detail in the shadows you'll have to lock yourself in a darkroom!

Jay Burlage October 22nd, 2007 04:21 PM

Quote:

Originally Posted by Glenn Chan (Post 762882)
Various companies are pushing the 3-D route.

.... And I think people would prefer 3-D over HDR.

Please explain? Are you referring to holographic or extensions of current CGI?

Jay Burlage October 22nd, 2007 04:29 PM

Quote:

Originally Posted by Giroud Francois (Post 762925)
HDR is perfectly viewable with regular display. Most of the HDR pictures you can find on the web are displayed on your computer screen and there is no need to have special hardware to understand the benefit of HDR capture.

But you're only 'seeing' 8 bits - the light levels are compressed to that range... It's hard to imagine seeing the 'true lights and darks' displayed. We are all so accustomed to the existing representative technology.

Glenn Chan October 22nd, 2007 07:09 PM

Jay, I'm referring to the current ways of 3-D projection like Real-D.
http://www.reald.com/

(There's a bunch of other companies too.)

2- Giroud, to clarify things... there's two different sides to HDR:

- HDR during image acquisition
- HDR when displaying images

The HDR images you see on normal computer screens have been tone-mapped to fit within the dynamic range of normal displays... which is not the same as an HDR display like what BrightSide is making.
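To illustrate the tone-mapping step Glenn describes (squeezing a 32-bit radiance map into the range of an ordinary display), here is a minimal sketch of a global operator in the style of Reinhard's; the function name and constants are illustrative:

```python
import numpy as np

def reinhard_tonemap(radiance, key=0.18):
    """Globally compress a 32-bit float radiance map into 8-bit values
    for an ordinary display. `key` sets overall brightness (0.18 is the
    classic mid-grey choice)."""
    # Log-average luminance: a robust estimate of the scene's overall level.
    lw = np.exp(np.mean(np.log(radiance + 1e-6)))
    scaled = key * radiance / lw
    mapped = scaled / (1.0 + scaled)  # compresses highlights smoothly toward 1
    return (mapped * 255.0).astype(np.uint8)
```

Note that the mapping is monotonic but heavily nonlinear: a 10,000:1 scene ratio survives, but squeezed into the 256 display levels, which is exactly the "compressed to that range" effect Jay describes above.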

Marcus Marchesseault October 23rd, 2007 12:35 AM

I dreamed up a technique that would allow motion/video cameras to capture HDR without the need to invent new types of imaging chips. Does anyone know how I might go about bringing this to fruition without giving away my idea? Cameras would need to be built with this technology from the ground up, but it does not require new types of components. Should I pitch it to the RED guys?

Glenn Chan October 23rd, 2007 12:52 AM

You can patent your idea (as long as it isn't already patented). Then it's ok to tell others about it since they can't use your idea without licensing it.

*Consult a lawyer about the legal details, how this works in practice (may not be the same as theory/what the law 'should' be), etc. Certain things like circuit designs may not be patentable and thus other companies can rip your design off (look at cases involving Behringer).
**And you might want to consult the relevant type of lawyer to help file the patent, to get the wording right and not have your application be rejected.

2- Or you can keep it a trade secret... that's what Red is doing with the Mysterium sensor. (As a side note, Oakley has patents on some optics technology... so it's interesting to see the two approaches. It might be because Jim got sick of people ripping off the patents?)

This is if it's something that's not easily reverse engineered.

3- If it involves double exposures, or a different sensor arrangement with ND over a portion of the photosites.... people have already thought of that. :)

4- You might also be able to make a prototype and demonstrate it, without filing a patent (yet) and without giving away your secret. But then it's likely cheaper just to file the patent.

Giroud Francois October 23rd, 2007 09:23 AM

quote "2- Giroud, to clarify things... there's two different sides to HDR:

- HDR during image acquisition
- HDR when displaying images"

Yes, I know, but for instance, if you really think about how your eyes handle HDR, you would understand they are not really seeing HDR; they are just adapting to the best range, as tone mapping does (and the older you are, the longer it takes).
When you enter a dark place, it takes time for your eyes to let you see the details in the shadows, and the same when you go out into a sunny place.
Likewise, you can easily blind people for a few seconds with regular 35mm projection:
just cut from a dark picture shown for a few minutes to full white for half a second.
So if in real life you cannot usefully see in HDR, I do not really see the benefit of building a technology that can display something your eyes will hardly see.
On the other hand, tone mapping is great because it gives you the full experience of seeing what would be overexposed to your eyes at the same time as what would be underexposed.
Usually such a feature is called "augmented reality", because you do not need to trade one thing to get the other; you just add new stuff to what you already have. Lots of technologies are going that way today.
And this requires special recording and processing, but no special display technology.

Jay Burlage October 23rd, 2007 09:38 AM

Quote:

Originally Posted by Giroud Francois (Post 763291)
if you really think about how your eyes handle HDR, you would understand they are not really seeing HDR; they are just adapting to the best range, as tone mapping does (and the older you are, the longer it takes).

If I'm not mistaken, your eyes 'can' take in more latitude in a moment than current technology has to offer. You're absolutely correct when the light exceeds that range, whatever it may be, depending on the individual (and how old they may be!)...

This is exactly what I was alluding to when I asked:

Quote:

Originally Posted by Jay Burlage (Post 762760)
Sometimes I wonder if it would be in a sense 'too much'... I have no idea what my work looks like on one of these emerging displays.

Since I obviously have to tone-map it down to 8-bit to 'see what it looks like'....

One thing is for sure, though, in terms of the advantages of capturing all that range: the ability to work with the footage in post is amazingly flexible, and that is a clear advantage even with mainstream display technology. I agree with you there!



DV Info Net -- Real Names, Real People, Real Info!
1998-2024 The Digital Video Information Network