s-log and new Sony x90


Donald McPherson
September 30th, 2017, 01:21 PM
With the new X90 having S-Log, is it really that much better than a camera set up with a good picture profile, given that many of us are crap at colour grading?

Doug Jensen
October 10th, 2017, 09:01 PM
There is no doubt in my mind that S-LOG combined with proper grading in Resolve is superior to anything that can be done with the normal REC709 gammas and typical paint menu settings.

Cliff Totten
October 11th, 2017, 06:36 AM
Yup. No doubt.

SLOG-2 maps the sensor's full dynamic range from top to bottom. On that sensor, you are looking at roughly 11 stops. Standard rec709-type profiles will chop that dynamic range and clip you to 6-7 stops of output. (To maintain standard rec709 contrast levels, they throw out DR that the sensor sees.)

So yeah, SLOG-2 custom maps all the photons those little pixel wells can hold, from empty to full, right up to 109 IRE.

Yes, you will get the maximum dynamic range that sensor can hold....but it won't be "free". To get that benefit, you may be forced to pay an 8-bit banding penalty. (It's a scene-dependent issue, like a blue sky.)

Some shots "could" suffer gradient banding when you normalize them to rec709. It's not guaranteed, but it is certainly an elevated possibility.
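
To make that concrete, here's a rough Python sketch of the mechanism. The curve below is a made-up, generic log-style curve (NOT Sony's actual S-Log2 math); the point is just that a flat curve spends relatively few 8-bit code values on any one slice of the scene, and grading then stretches those few values apart:

import numpy as np

def toy_log_encode(x):
    # Generic log-style curve mapping linear 0..1 to 0..1 (not S-Log2).
    return np.log2(1 + 255 * x) / np.log2(256)

# A smooth sky gradient spanning about one stop of scene brightness.
sky = np.linspace(0.30, 0.60, 1920)

# The camera records it through the log curve into 8-bit code values.
log_8bit = np.round(toy_log_encode(sky) * 255)

# In the grade we "normalize" that slice back to full Rec.709 contrast,
# i.e. stretch the recorded range out to 0..255.
codes = np.unique(log_8bit)
print("distinct code values across the sky:", len(codes))
print("output step size after stretching:", 255 / (len(codes) - 1))

With only about 30 distinct recorded values covering the sky, the stretched output jumps in steps of roughly 8 code values, which is exactly the kind of gradient banding you can see on a blue sky. Shoot the same gradient in 10-bit and you get four times as many steps to play with.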

SLOG-3?....avoid it like the plague. It's way too flat for 8-bit and maxes out at around 90 IRE, so it doesn't even use the full 0-255 range of 8-bit values! SLOG-2 already maps the sensor's full output into 8-bit anyway, so there is no need to use SLOG-3 with an even more limited 8-bit value mapping. SLOG-3 is not tailor-mapped to each sensor model the way SLOG-2 is; it uses a fixed Cineon-type gamma curve, something like a 16-stop curve, whereas SLOG-2's curve is based on each sensor model's total and REAL output ability. It's not a good thing to dump a 10-11 stop sensor into a huge 16-stop Cineon/SLOG-3 curve.....with only 8 bits of shade/gradient depth!
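
For anyone who wants to see where that ceiling comes from, here's a short sketch of the S-Log3 transfer function as published in Sony's white paper (constants copied from that document; double-check them against the original before relying on this). Even a highlight six stops over middle grey lands well short of the top of the signal range:

import math

def slog3(reflectance):
    # Sony S-Log3 OETF: scene reflectance (0.18 = middle grey) -> signal 0..1.
    if reflectance >= 0.01125:
        return (420.0 + math.log10((reflectance + 0.01) / 0.19) * 261.5) / 1023.0
    return (reflectance * (171.2102946929 - 95.0) / 0.01125 + 95.0) / 1023.0

for stops in (0, 2, 4, 6):
    refl = 0.18 * 2 ** stops
    print(f"+{stops} stops over middle grey -> {slog3(refl) * 100:.0f}% signal")

That prints roughly 41%, 56%, 71% and 87%, so the top chunk of the 8-bit range simply never gets used, which is part of why S-Log3 is even harder on 8-bit recordings than S-Log2.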

Remember, 10-bit is where we'd all really like to use SLOG-2/3, as it gives you a color palette roughly 64 times bigger than 8-bit (four times the levels per channel).
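
For anyone who wants to check the back-of-the-envelope math on that:

levels_8bit = 256 ** 3     # ~16.7 million possible RGB values
levels_10bit = 1024 ** 3   # ~1.07 billion possible RGB values
print(levels_10bit / levels_8bit)   # 64.0 -- 4x per channel, 64x overall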

Use ProRes if you can. Even in 8-bit, you will get more bits allocated deep into the shadows and fewer artifacts in grading than with MPEG Long GOP. When H.264 is stressed, it sacrifices bandwidth in the shadows first. ProRes doesn't do this.

Hope this is not "TMI"...lol

CT

Cliff Totten
October 11th, 2017, 06:57 AM
Duplicated post

Craig Seeman
October 11th, 2017, 03:06 PM
On the Z90, HD is 10-bit 4:2:2, so SLog-2 should be very nice there.

Robert Young
October 11th, 2017, 10:34 PM
Looking at the specs, it looks like only SLOG-3 is available on the Z90.
Or am I missing something?

Cliff Totten
October 12th, 2017, 08:30 AM
Just found that the AX700 has:

PICTURE PROFILE
Yes (Off/PP1-PP10) Parameters: Black level, Gamma (Standard, Still, Cine1-4, ITU709, ITU709(800%), S-Log2, S-Log3, HLG, HLG1-3), Black Gamma, Knee, Color Mode (Standard, Still, Cinema, Pro, ITU709 Matrix, B/W, S-Gamut, S-Gamut3.Cine, S-Gamut3), Saturation, Color Phase, Color Depth, Color Correction, WB Shift, Detail, Copy, Reset

So I'm certain that the X90 (whoops!... Z90) will have the same SLOG-2 also.

Whew!... you scared me there for a second, as I have the NXCAM NX80 on pre-order. (I just sold my X70. I don't need SDI as I shoot mostly 4K for everything now and don't need 10-bit 1080p anymore... I still have a Z150 for that anyway.)

CT

Doug Jensen
October 12th, 2017, 12:03 PM
I have a demo Z90 in my hands and it definitely has S-LOG2 and S-LOG3.

BTW, Cliff, it is the Z90 not the X90. I keep making the same mistake! :-)

Cliff Totten
October 12th, 2017, 12:37 PM
Haha,...good catch! ;-)

You got a Z90? Spill your guts... what do you think?

Are you under NDA?

Doug Jensen
October 12th, 2017, 04:57 PM
All I can say right now is that so far I like the Z90 very, very much. What a great picture! I think it looks better than the Z150 and almost as good as the FS5. I might be able to post some sample footage in a few days.

Cliff Totten
October 12th, 2017, 07:25 PM
Doug, I'm really interested in your opinion of that crazy-amazing PDAF. (Phase Detection Auto Focus)

I know you are not a fan of AF.... I never was either. But this Sony technology is beginning to change my mind on the topic. It's sort of shocking, really.

Really hope you run that feature through the wringer.

I love manual rings on true geared lenses with hard stops. Pulling focus on fly-by-wire, endlessly rotating, movement-buffered lenses is a pain. So this PDAF "can" make shooting with that camera more tolerable for certain fast-moving subjects. Plus that object-tracking technology? Damn...

But you tell me, you've got one, I don't. Lol

Doug Jensen
October 12th, 2017, 09:10 PM
Oh, I assure you I'll be taking a very close look at the AF capabilities of this camera. I have already braced myself for the potential blasphemy of embracing it. This could be the one to break the rules, but I have not turned it on yet. That said, I have already noticed that the peaking is much better than expected and the camera as a whole is much easier to operate than I would have expected for a small handycam. I am not a fan of that form factor at all, but this one is winning me over already.

Paul Anderegg
October 13th, 2017, 03:26 AM
Doug, with the weird "spin the wheel 50 times" focus control of the X70, perhaps a "push to AF" would be a good custom button to add to the Z90 :-P

Paul

Cliff Totten
October 13th, 2017, 07:57 AM
So I'll share this rumor that has no evidence behind it:

About two or three years ago, I read that Sony and Canon cross-traded patents. Sony wanted Canon's "dual pixel" AF patent (based on on-sensor phase detection) and Canon needed Sony's Exmor patents for their on-sensor column A/D conversion process.

"If" this happened, it allowed Sony to pursue phase-detection processing without the fear of being sued by Canon. Not long after that, Sony began making HUGE strides in AF using the phase-detection process. Coincidence? Maybe, maybe not!

Today, many in the industry are saying Sony's PDAF is on par with Canon's fantastic dual pixel AF. Is this because of the patent trade??? If so, we should begin to see Canon make a big jump in signal-to-noise ratio on their future sensors, which should lead to better dynamic range...something that has hurt Canon models in the past.

Anyhoo, this PDAF technology started on Sony Alpha models and I'm glad to see it trickling "up" the Sony price ladder into the pro models. Why does it seem that XDCAM has always gotten everything "behind" Alpha???

Sony's old contrast-detection systems are done. From now on, we can expect every future Sony camcorder to have this PDAF. It's already incredibly wicked for the short time that Sony has been developing it. It's only going to get more and more incredible as time goes on.

CT

Paul Anderegg
October 13th, 2017, 12:35 PM
All I can say right now is that so far I like the Z90 very, very much. What a great picture! I think it looks better than the Z150 and almost as good as the FS5. I might be able to post some sample footage in a few days.

When you say you think it looks better than the Z150, do you mean with the HDR and log stuff, or just cleaner and sharper and such?

Paul

Doug Jensen
October 13th, 2017, 12:58 PM
Cleaner and sharper and such for regular WYSIWYG REC709 shooting. I have not ventured into S-LOG on the Z90 yet, and HDR holds no interest for me at all at this time.

Paul Anderegg
October 13th, 2017, 01:04 PM
Will be very interesting to see how the built-in SLOG2 LUTs in FCPX work with these new features.


Paul

Cliff Totten
October 13th, 2017, 01:34 PM
I strongly suspect that all three of these new cameras use the same image sensor as the RX10 III, RX10 IV, RX100 V and Z150. It's the latest stacked Exmor RS BSI sensor with super-fast readout. Who knows if they changed the processing since the Z150. Maybe they made even further noise-reduction improvements over the Z150?

This really raises the question: "Just how much can you squeeze out of a 1-inch-type sensor?" Whatever that is, Sony seems to be using and processing every darn photon they can get!

Doug, if you can, try to measure the dynamic range you are seeing. (if you have charts)

CT

Doug Jensen
October 13th, 2017, 05:09 PM
Sorry, I don't have any DR charts. Just the normal Chroma duMonde charts and stuff like that.

Paul Anderegg
October 13th, 2017, 11:58 PM
:-D

How I made my own Dynamic Range Test Chart - www.similaar.com (http://www.similaar.com/foto/flaat-picture-styles/creating-my-own-dynamic-range-test-chart.html)

Doug Jensen
October 16th, 2017, 04:35 AM
Hey Paul, I appreciate the ingenuity it took to create your own chart but DR is not something I'm interested in measuring. The camera is whatever the camera is.

A DR chart might answer someone's questions about their camera's capabilities or help them make a purchase decision, but it won't help me create a PP or provide any usable information about the camera that I can't see on my regular WFM and monitors. If you told me the camera has 8 stops, 10 stops, or 20 stops of dynamic range, it wouldn't make any difference to me at all in the operation of the camera. All that matters is squeezing as much performance out of the camera as I can with whatever limitations it has.

Jay Allen
November 23rd, 2017, 08:44 AM
Doug,
Now that Vimeo supports HDR, have you changed your position on using it?

Paul Anderegg
November 23rd, 2017, 09:15 AM
I wonder how HDR would be viewable by a Vimeo viewer? My Samsung TV has not yet received the capability of viewing YouTube HDR, and just in the last few months got Netflix HDR capability added... my MacBook and my home PCs can't spit out HDR to my TV either.

Paul

Doug Jensen
November 23rd, 2017, 03:54 PM
Doug,
Now that Vimeo supports HDR, have you changed your position on using it?

Absolutely not. At this time, in December 2017, I have zero interest in HDR or HLG as a producer, content provider, or even as a consumer. I do not own an HDR-capable television. I don't know anyone who does. I don't have any friends or family who would even know what HDR or HLG is if I asked them. I am not aware of any television shows or streaming content (that I would watch anyway) being delivered in HDR. Not a single client has asked about HDR. So I don't see how adopting an HDR workflow would add a single dime to my bottom line, and after all, that is what it is all about at the end of the day.

Cliff Totten
November 24th, 2017, 09:50 AM
It is kinda funny how things evolve, Doug. When "HD" came out, many of us said almost all the things you said above... "why?" or "what for?" After all, when it first came out, no customer was asking me for it and nobody cared about "HD". Eventually, 720 and 1080 took over anyway.

"4k?....what for?....why?...nobody is asking for it. 1080 is all anybody needs" Today, 4k is creeping deeper and deeper into our lives. I stopped shooting 1080 about two years ago. I shoot 4k even for clients that only ask for 1080 delivery.

High Dynamic Range?...nobody wants it, nah...what for?

I think HDR is not a fad like 3D was. What killed 3D was clumsy glasses and headaches, ghosting images and general ugliness like that. (Although 3D has been around and lasted since I was a kid.)

I have seen some HDR demonstrations at InfoComm and NAB in the last year that are truly spectacular... I mean mind-blowing. It's one thing to shoot 14 stops of dynamic range and grade it down to 6 or 7 stops for a rec709 projection or display..... It's another thing to SEE ALL 14 of those stops that you captured on a true HDR presentation. It has a very high "WOW" factor and I don't think this one is going to die.

Doug Jensen
November 24th, 2017, 11:09 AM
Hi Cliff,

It is not the same thing at all as making the switch from SD to HD . . . or now HD to 4K. When we changed resolutions, nothing else really changed about how we shoot, light, frame, compose, expose, etc. All we got was more resolution, and it is/was very easy to down-convert 4K to HD or HD to SD or whatever you want. Shooting at a higher resolution didn't compromise our ability to deliver a lower-resolution product; in fact, it could be argued that shooting at a higher resolution actually provided a better low-resolution product than if acquisition had been done at the lower resolution.

Can you say the same thing about shooting with S-LOG and/or HLG on the Z90? Are you certain that 8-bit S-LOG or HLG is going to provide you with a finished REC709 deliverable that looks better than if you had shot the same footage with a nice conventional Picture Profile? Are you prepared to say that is possible? Because I've used the camera and I'm not going to say that at this time about S-LOG and certainly not HLG.

I am not afraid of new technology or change. I was an early adopter of HD, made the switch in 2005, and never shot SD again. I made the switch to 4K in 2013 and I only shoot HD about 10% of the time. If clients don't see the benefits of shooting at a higher resolution, I educate them until they say yes. But what is that argument today for HDR? What would I possibly tell an average client about HDR that is going to make them care? I know there are some clients who would benefit when their video will be displayed at trade shows, meetings, theaters, etc. where they provide the screens and viewing experiences. But that is not typical.

Technically I am already shooting in HDR today because I only shoot 10-bit S-LOG or 16-bit RAW with my F55 and FS7. Both S-LOG and RAW can be graded for HDR at any time in the future that I deem it to be desirable. So that is good. But HDR is not the reason I am shooting S-LOG/RAW today. I am shooting S-LOG/RAW today because it gives me superior results today for REC709 delivery -- which is what pays the bills and puts food on the table today. Any benefit later on down the road from re-grading as HDR would just be icing on the cake.

And I may decide to shoot S-LOG with the Z90 for the exact same reasons. But I would not use HLG right now because I have no need for HLG at this time and it definitely cripples the high-end SDR delivery that I need today. I need to satisfy today's needs today before worrying about future-proofing my footage, if that future-proofing compromises my needs for today. I have no doubt that HLG is not a good idea for post. HLG is a mode on the camera for live broadcasting of feeds to HLG-equipped equipment. And I don't do either of those things. So for me, HLG is dead on delivery.

Before someone shoots HLG today, I suggest that they seriously ask themselves why? HLG and HDR might be the greatest things you have ever seen at a booth in a trade show, but how does that translate down to your use of a camera like the Z90 today in your own environment? What the hell is someone going to actually do with HDR today? What is it that a Z90 owner is going to do with HLG other than look at it on their home TV or put some samples online just for the heck of it? Seriously, forget about whether it looks better or not, what is the actual argument for shooting in HLG today? I'd love to hear it.

BTW, I make some extra income from stock footage (over $3K/mo.) and 95% of my sales are for the HD versions of the 4K clips that I upload. So if 4K isn't even in very high demand yet, I find it very hard to believe there is going to be a sudden surge in demand for HDR. Mind-blowing demonstrations at trade shows and great demo footage in a showroom at Best Buy have very little to do with the average joe on the street earning a living from video production.

Cliff Totten
November 24th, 2017, 12:25 PM
I actually agree with just about everything you said. I think it's pretty on-point with what is going on "today". I think that HLG is 5 years ahead of itself right now. It has very little use at the moment. However, HDR screens are slowly getting out there and YouTube is working on metadata flags to support it better.

But yeah...."today" it's almost worthless.

The biggest problem is Sony's own implementation of HLG on many of its cameras. HLG is NOT supposed to be in 8-bit color sampling. Even on the GH5, it FORCES you to select one of its 10-bit codecs to even be able to activate HLG.

Sony insists that we shoot 4K HLG like it's SLog... in 8-bit. This is not what the BBC and NHK created HLG to be used in. Even YouTube switches its internet compression to 10-bit on HLG-flagged uploads. Yup.... YouTube is doing 10-bit compression for HLG delivery to homes!!
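
A quick sketch of why the bit depth matters so much for HLG. This uses the BT.2100 HLG OETF (constants from the ITU-R spec) and just counts how many code values land in the brightest stop of the scene; treat it as rough arithmetic, since it ignores narrow-range/legal levels:

import math

A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e):
    # BT.2100 HLG OETF: scene-linear 0..1 -> signal 0..1.
    return math.sqrt(3 * e) if e <= 1 / 12 else A * math.log(12 * e - B) + C

for bits in (8, 10):
    levels = 2 ** bits
    top_stop = (hlg_oetf(1.0) - hlg_oetf(0.5)) * levels
    print(f"{bits}-bit: about {top_stop:.0f} code values in the brightest stop")

Roughly 33 code values for the brightest stop at 8-bit versus about 131 at 10-bit, which is exactly why the BBC/NHK designed HLG around 10-bit delivery in the first place.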

So, yeah... we all must agree that Sony is doing HLG "half-assed", to be totally honest.

I also agree that Slog is all the "HDR" you need to capture, and if your camera has Slog, just stick with Slog. HLG is "half-log" and Slog is "full log". Even if you are locked down to a crappy 8-bit. (Yeah.... we all know how "risky" Slog is in 8-bit.)

Anyhoo.... Sony, if you are serious about "HLG" then at least give it the proper 10-bit sampling it deserves and was intended for.
I think that camera manufacturers will need to give up protecting 10-bit for their upper models and begin offering it in lower models. Especially now that we are beginning to see consumer TVs able to show 10-bit color and HDR.

Consumer TV growth with 10-bit color and HLG conversion will drive the demand on cameras. As HLG TVs get more and more popular, cameras and cameramen will follow.

I think that rec709 will die very slowly. It's old now, and it dates back to a time when 6 or 7 stops of capture and display was the best a consumer could expect. Today, we can capture and display a lot more than 6 or 7 stops.

Do I think that High Dynamic Range, Dolby Atmos, HLG, rec2020/BT2020 and other standards have a bright future? Hell yes...... why? Because technology is making them easier and more common.

Practically every TV in 5 years will be HDR-ready. Who out there will say in 5 years... "Yeah... HDR is everywhere, but I'm still a rec709, 6-stop delivery guy"?

CT

Mark Watson
November 25th, 2017, 08:14 AM
For anybody else trying to follow along with this HDR stuff....

Found some info on a Canon site that explains what HDR is all about. I found it highly readable, and now see that HDR, to be done correctly, is much more than flipping a switch on the camera or TV. Hopefully the "standards" that are mentioned have gained enough traction that we won't see a bunch of the early HDR TVs and cameras become instantly obsolete (like the Blu-ray format war).

http://learn.usa.canon.com/app/pdfs/white_papers/White_Paper_HighDynamicRange.pdf?cm_mmc=NW-_-C300-Mark-II-_-White-Papers-_-HDR

http://learn.usa.canon.com/app/pdfs/white_papers/White_Paper_OnsetHDR.pdf?cm_mmc=NW-_-C300-Mark-II-_-White-Papers-_-HDR-WCG

Paul Anderegg
November 25th, 2017, 08:51 AM
I would think that keeping an eye on what 4K sports broadcasters are doing as far as HDR would help to see where the HLG thing is going... as far as live HDR, you would expect to see it first applied to live sports coverage.

Paul

Cliff Totten
November 25th, 2017, 12:50 PM
True....

Whether we like it or not, most TVs in 5 years will display 10-13+ stops of dynamic range. Even future OLED tablets and laptops will do this too.

Does anybody at all out here think that we will still be delivering in 6 stops of "ancient" rec709? I don't think any of us believe that will be the case.

The only "real" question is which 10bit consumer gamma standard will be the actual winner in this HDR race.

Sorry Sony....I love you to death but your 8bit HLG will be a total joke in 3 years. Nobody will capture HLG in 8bit and upload that for YouTube's 10bit encoding to be played on common 10bit HDR displays.

Unfortunately the Z90 will really be a rec709 camera. It will never be a real HDR camera for HDR delivery.

Sony,.... please don't screw up and make the new A7S-III or A9S an 8-bit camera. It will be DOA if you cripple it to 8-bit only.

CT

Paul Anderegg
November 25th, 2017, 12:56 PM
Cliff, unfortunately, things like megapixels seem to be more important... while camera sensor density went up, for the most part light sensitivity and signal-to-noise ratio went down, and 8-bit 4:2:0 still dominates 4K land... I see 8K coming to most devices before we see real commonplace 10-bit HDR in the mainstream for devices we get off the shelf at reasonable prices. And while compression has made great strides with HEVC, we now live in a world where data is becoming more expensive to access wirelessly, so all those extra bits that make things better typically get left behind for good ol' fashioned megapixel superiority.

Paul

Cliff Totten
November 25th, 2017, 02:22 PM
I honestly don't see 8K going anywhere but Hollywood. There is no significant amount of 8K content, there are no consumer 8K displays and, worst of all.... 8K is very hard to differentiate from "good" oversampled 4K.

I think the law of diminishing returns kicks in after 4K. There is just very little benefit on the delivery end of 8K. Yes, for Hollywood film makers there is an advantage in the "crop-in" power that 8K offers... but again, that's only useful in a 4K delivery.

High dynamic range is easily noticeable and very striking... even when you see it on display at Best Buy.

I'm certain HDR will be much more common in 5 years, but 8K won't. It will exist in "exotic", very large presentations.

Donald McPherson
November 26th, 2017, 03:37 AM
"8k" think of the reframing. Many of us hobbyists have only recently jumped on the 4k market so I will be sticking to my x70 for a few years yet.

Noa Put
November 26th, 2017, 04:07 AM
I'm certain HDR will be much more common in 5 years, but 8K won't.

Common for who? The average family, or us pixel-peeping videographers? I don't expect any of my wedding clients to ask me for an HDR film in the coming years, and I certainly am not going to upgrade all my gear for an HDR workflow. More dynamic range doesn't mean a better film to me, and it's not something I can use as a selling point to my clients because I would have to charge extra for it. They don't care if there is more information in the shadows or highlights; they just want to see a well-shot and edited film with clear audio, and they just want me to catch all those important or emotional moments. No one is going to be disappointed if the shadows are darker or if the highlights of a window are blown out.

Cliff Totten
November 26th, 2017, 09:14 AM
Common for every TV sold 5 years from now. Nobody knows what "HDR" is today. But as time goes on, consumer awareness will grow, just like it is doing with "4K" today.

This has been the progression of TV for how many years now? Look at the history of TV itself.....

First TVs.... "It's too expensive. I'll stick with my trusty radio, thank you."

Color TV... "No thanks, my single wood-cabinet, 15-inch B&W model is all I will ever need. This color talk is just a passing fad."

Cable TV?... "Sorry, why would I pay for my TV when I already have 5 local channels? Rabbit-ear antennas are all I will ever need."

VHS recorder?... "What for?... They are too expensive and I don't get how to even connect and use one."

DVD player?... "I'm fine with my VHS. Who needs digital video anyway? It looks the same to my grandmother."

Flat panel LCD screen?.... "No thanks.... I spent $800 on my Sony Trinitron glass tube TV 5 years ago and it still looks amazing. No need to throw it away."

Hi Definition?.... "What's this HD thing I'm starting to hear about? I don't need it. My DVDs look fine. HD is an expensive thing created by the TV companies to make us spend money on their industry."

4K.... "All this 4K talk is stupid. Nobody wants it, my grandmother says she can't even tell the difference, there are very few 4K TV models and they are not affordable anyway.... there is ZERO 4K content anywhere. YouTube will NEVER do 4K and it's a silly fad that will never go anywhere... it's going to die like 3D did."

Guys.... haven't we heard this all before? Every single TV advancement has ALWAYS begun with a rocky, slow start. These types of things have ALWAYS had their critics and nay-sayers. "HDR" is no different.

It takes consumer awareness to drive us to change.

I was in Best Buy last weekend and was surprised at how many 4K models are out. 75% of all the models above 50 inches are now 4K. You can easily buy a good 4K TV for under $500.

Many of these models have "high dynamic range" on their boxes ALREADY!

Guys,... this trend is not going to reverse itself, no matter how much anybody would like to see that happen.

Rec709 is NOT the future, it's the past. Once consumers LEARN what an HDR image looks like, they will like it, because it DOES look spectacular.

(I'm sure my grandmother will still not agree)

CT ;-)

Lou Bruno
November 26th, 2017, 12:29 PM
All I know is that I have a new home theatre.... Samsung with HDR. Also a 4K Apple TV and Netflix.
HDR ROCKS!!!! It's all about the menu settings. YouTube and Netflix, as well as Amazon, offer SOME HDR movies. The color depth will knock your socks off.


Also have the FDR AX700...but that is for another day.

Cliff Totten
November 26th, 2017, 04:45 PM
It's just a matter of cost and consumer ignorance. Once the cost drops and consumer awareness rises, HDR will begin to thrive.

I'm guessing that in 5 years, 75% of all new TVs, large and small, will be HDR-capable. Every Mac will have an OLED Retina display, as will all iPads and tablets.

Rec709 is not going to last forever, and CES 2018 will highlight A LOT of new HDR offerings. Mark my words.... watch how much "HDR" is talked about at January's CES.

Vimeo just announced 10-bit compression with HDR, YouTube is doing it, and Netflix is working on it too. People WILL be watching HDR online before we know it and will start asking if WE can give them that. Hell,... even the global (cable) media company I work for is investigating HDR possibilities today.

Again.... don't hold on to rec709 so tightly.... it's going to be replaced someday.

Paul Anderegg
November 27th, 2017, 01:55 AM
While HDR and HLG are important future advancements, I think it is JUST AS IMPORTANT that a means of exporting HDR/HLG clips to REC709 for broadcast be made a priority... we are already seeing very poorly done HDR encoding on current 4K Blu-ray releases.

Paul