View Full Version : Sony FDR-AX100
Ken Ross
January 30th, 2014, 01:11 PM
Ron, so are you saying Edius 7 will edit 4K XAVC-S natively but can't output XAVC-S? I thought it could, and that's how Grass Valley promoted it.

Jack Zhang
January 30th, 2014, 06:29 PM
I will be very interested to see if at NAB the RX10/FDR-AX100 sensor plus a new lens gets mated to an FDR-AX1 body and back end. For lack of anything else, call it the FDR-AX10. The optics of the Sony G Lens would have to be changed to suit the 1" form factor of the sensor, as the G Lens is currently designed only for 1/3" sensors.

Ron Evans
January 30th, 2014, 06:37 PM
Ron, so are you saying Edius 7 will edit 4K XAVC-S natively but can't output XAVC-S? I thought it could, and that's how Grass Valley promoted it.

Correct. Edius will edit both XAVC and XAVC-S but will not render to these formats. XAVC editing is smooth on my system, but XAVC-S will not play smoothly at full frame rate, which is why I have taken to converting to HQX for editing, much like I did when AVCHD came out a few years back. I do not recall Edius promoting rendering to XAVC or XAVC-S. You will need 7.21 to edit XAVC-S. Edius is my main editor, but I do have and use Vegas Pro 12, which will edit and output XAVC or XAVC-S.

Ron Evans

Bill Koehler
January 30th, 2014, 07:00 PM
The optics of the Sony G Lens would have to be changed to suit the 1" form factor of the sensor, as the G Lens is currently designed only for 1/3" sensors.

Well aware of that. It's why I said "+ a new lens".

Ken Ross
January 30th, 2014, 07:28 PM
Correct. Edius will edit both XAVC and XAVC-S but will not render to these formats. XAVC editing is smooth on my system, but XAVC-S will not play smoothly at full frame rate, which is why I have taken to converting to HQX for editing, much like I did when AVCHD came out a few years back. I do not recall Edius promoting rendering to XAVC or XAVC-S. You will need 7.21 to edit XAVC-S. Edius is my main editor, but I do have and use Vegas Pro 12, which will edit and output XAVC or XAVC-S.

Ron Evans

I do have 7.21 Ron, so that's not a problem. I may have to add Vegas if GV doesn't upgrade to render to XAVC-S.

Ron Evans
January 30th, 2014, 10:07 PM
Why would you want to render to XAVC-S? Sony have been giving away a voucher for Vegas with the latest high-end camcorders, so maybe one will come with the FDR-AX100.

Ron Evans

Vaughan Wood
January 31st, 2014, 12:35 AM
Question for Ron Evans. As I do similar events to you, Ron, and all of our events are currently mastered to SD DVD (PAL), do you feel that your 4K camera gives a noticeable difference in detail when it finally gets back to SD? Currently we use the Panasonic AG-AC130/160 series, which give a much cleaner picture than we've ever had before. I too use Edius for all editing. Any difference in facial features when you have 30-odd kids on a dark stage would be worth $2,000!

Thanks, Vaughan

Ken Ross
January 31st, 2014, 06:05 AM
Why would you want to render to XAVC-S? Sony have been giving away a voucher for Vegas with the latest high-end camcorders, so maybe one will come with the FDR-AX100.

Ron Evans

Ron, for a 1080p project, I wouldn't. However, down the road when I have a 4K TV, I'd want to avoid generational losses for my personal footage as much as possible while maintaining 4K from start to finish. By avoiding a codec change in the editing process and keeping things XAVC-S from start to finish, I'd think that would be my best shot.
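Ron's workflow above (transcoding the hard-to-play XAVC-S files to an intermediate codec before editing) is the same proxy/mezzanine approach many of us used when AVCHD arrived. Purely as an illustration, here is a minimal batch-transcode sketch in Python driving ffmpeg. The folder names are made up, and because ffmpeg has no Grass Valley HQX encoder, ProRes stands in for the intermediate; the real HQX conversion would be done with Edius/Grass Valley tools.

```python
# Hypothetical batch transcode of 4K XAVC-S clips to an edit-friendly intermediate.
# ffmpeg cannot encode Grass Valley HQX, so ProRes 422 is used as a stand-in here.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("ax100_clips")       # assumed folder of camera-original .MP4 files
OUT_DIR = Path("intermediates")        # assumed output folder
OUT_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.MP4")):
    out = OUT_DIR / (clip.stem + ".mov")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "2",   # ProRes 422 as the mezzanine codec
        "-c:a", "pcm_s16le",                      # uncompressed audio for editing
        str(out),
    ], check=True)
```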
Ron Evans
January 31st, 2014, 08:32 AM
Once you have edited the file, the NLE will re-encode it, so whether it ends up in the same codec or another is debatable. It will then depend on the playback system as to which codec is appropriate. The PC encode of XAVC-S will depend on the PC version of the codec and is unlikely to be the same as the source, which is hardware accelerated with a custom circuit. It will be different and could be better or worse!!!

Ron Evans

Ken Ross
January 31st, 2014, 09:42 AM
Ron, I agree, it's a crapshoot. It's a new codec, and as with all new codecs it will necessitate some experimentation to see what works best. To be honest, we're not even sure what will play back the 4K project on a 4K TV, just as we're not sure what will play back the output from the Sony AX100. We know for sure the AX100 will play back its output on a Sony 4K UHD TV, but beyond that, who knows? It's always fun being on the cutting edge...sometimes not so much. ;)

Ron Evans
January 31st, 2014, 11:38 AM
I think the FDR-AX100 has HDMI 2.0, so it should play back 3840x2160 to any TV with this capability. We shall see!!

Ron Evans

David Heath
January 31st, 2014, 01:35 PM
Once you have edited the file, the NLE will re-encode it, so whether it ends up in the same codec or another is debatable.

Currently with XDCAM it's possible with some NLEs to do what's called "smart render". Basically, the NLE will re-render around a cut edit for a few frames (to rebuild a valid GOP structure either side of the edit) and render anything like titles, mixes etc., but the vast majority of the material is unchanged from the original. I'd expect the same to happen with XAVC-S.

Ron Evans
January 31st, 2014, 01:47 PM
Currently with XDCAM it's possible with some NLEs to do what's called "smart render". Basically, the NLE will re-render around a cut edit for a few frames (to rebuild a valid GOP structure either side of the edit) and render anything like titles, mixes etc., but the vast majority of the material is unchanged from the original. I'd expect the same to happen with XAVC-S.

Yes, I expect this may come, but once the edit includes colour correction or cropping, the whole thing has to be encoded again. At the moment it seems that the file is encoded again even in Vegas.

Ron Evans

Pavel Sedlak
February 4th, 2014, 01:00 PM
It is of course easier to provide the long zoom with the smaller sensor(s) of the FDR-AX1. And the FDR-AX1 of course has XLRs built in from the get-go. I find this time of year impossible to make a decision. Every year I wait for the two big North American shows, CES and NAB, and then start making camera decisions.

I will be very interested to see if at NAB the RX10/FDR-AX100 sensor plus a new lens gets mated to an FDR-AX1 body and back end. For lack of anything else, call it the FDR-AX10.

Yes, a new lens, but for 4K it is not easy to make sharp corners at wide angle. I tried the small JVC 4K camera last year and the lens was the most interesting part: slow and not wide. I was also very happy two years ago that the new HD cameras had better LCDs and EVFs, but for focusing 4K we are back to that old problem: the LCD has only 1280x720 pixels for 4K focusing. My hardware is also happy for now (small hard discs without power adapters work well with my notebook i7), but 4K needs an upgrade or proxies. I think I can still wait one or two years before 4K starts to mean something for me. But new cameras are always so nice, I like them .-)

Uwe Boettcher
February 5th, 2014, 06:39 AM
FDR-AX100 - YouTube
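Going back to the smart-render exchange between David Heath and Ron Evans above, here is a purely conceptual sketch of the idea (it is not any NLE's actual behaviour or API): only the GOPs touched by a cut, or by an effect such as a title or colour correction, get re-encoded, and everything else is copied untouched from the source. The GOP length, frame numbers and function name are all illustrative.

```python
# Conceptual sketch of "smart render": classify GOP-aligned segments of a timeline
# as bit-for-bit copies or re-encodes. Illustrative only, not a real NLE interface.

GOP_LEN = 30  # frames per GOP (illustrative)

def plan_segments(total_frames, cut_frames, effect_ranges):
    """Return (first_gop, last_gop, action) runs covering the whole timeline."""
    n_gops = (total_frames + GOP_LEN - 1) // GOP_LEN
    dirty = set()
    for f in cut_frames:                         # a cut invalidates the GOPs either side of it
        dirty.update({f // GOP_LEN, max(0, f // GOP_LEN - 1)})
    for start, end in effect_ranges:             # titles/colour correction force a re-encode
        dirty.update(range(start // GOP_LEN, end // GOP_LEN + 1))
    plan = []
    for g in range(n_gops):
        action = "re-encode" if g in dirty else "copy"
        if plan and plan[-1][2] == action:
            plan[-1] = (plan[-1][0], g, action)  # extend the previous run
        else:
            plan.append((g, g, action))
    return plan

# A 10-minute 30p clip with two cuts and a title over the first five seconds:
print(plan_segments(18000, cut_frames=[4500, 12000], effect_ranges=[(0, 150)]))
```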
Phil Lee
February 11th, 2014, 01:54 PM
Hi

Let's toss around this AX100 codec for a minute. 3840x2160 @ 30p using only 60 Mbps? If we compare this to 24 Mbps (which we know can be fairly robust) at 1920x1080, mathematically this "seems" to be an H.264 "downgrade" in motion handling and durability. So, pound for pound, 4K at 60 Mbps would equal 15 Mbps in 1080 (if you think of 4K as HD quadrants). XAVC-S has something extra thrown into the mix: it runs at the very highest H.264 "5.2" level profile. What about AVCHD? It's set at High Profile but only Level 4.0, so it uses a lower H.264 toolset. So here is the technical question for anybody out there who might know: does the higher XAVC-S H.264 Level 5.2 justify the lower bitrate used for 60 Mbps 4K at 30p? Do the additional tools and calculations cover that bitrate loss? Can XAVC-S provide at least an equal amount of motion handling at 4K 29.97p running at 60 Mbps as 1080 does at 24 Mbps in 29.97p, because it uses a deeper toolset? Can anybody with a Handycam AX1 shoot 10 seconds of a scene with a modest amount of movement and upload it somewhere for people to test? This new AX100 60 Mbps-only bitrate might be the biggest question about the overall image quality of this new little camera. At $2,000 I find it hard not to make it my first jump into 4K. For a full manual camera there is not much to lose, and it pretty much forces me to get into 4K sooner than I originally planned. (I'll probably buy a "pro" 4K camera next year but start "playing" with this one in two months.) CT

The level denotes the maximum options and "best" feature set allowed and is a way of describing the hardware required to decode; it guarantees nothing about the encoding. For example, a media player typically supports Level 4.1 in H.264. Level 4.1 has a maximum resolution of 2048x1024 at 30 fps, and at 1280x720 it supports up to 68.3 fps. Notice how this fits the Blu-ray format: Blu-ray is limited to 30 fps at 1080p (which may of course be carried as fields, so 60i), but will do 60 fps at 720p. Blu-ray requires a Level 4.1 decoder. The next one up, Level 4.2, means a decoder must support up to 60 fps at 1080p, and so AVCHD Progressive requires a decoder that handles Level 4.2. This is why support for 60 fps at 1080p is much more limited in hardware decoders: they typically only needed to go as high as Level 4.1 before its arrival.

Some people may quote Level 5.2 for the AX100, but that only relates to Sony's branding of H.264 as XAVC, which can go all the way up to 60 fps at 4K. In the AX100, 4K is Level 5.1 (which is all that is needed to support its 30 fps at 4K), and HD is still Level 4.2. The Panasonic GH4, which has the option to record HD at 200 Mbps All-Intra, makes HD on the GH4 Level 5 (only Level 5 decoders are guaranteed to cover that bitrate), but it is still Level 5.1 for its 4K. So to decode the GH4's 200 Mbps All-Intra 1080p we would need a hardware or software decoder that supports Level 5, but for the Sony AX100's HD, Level 4.2 will do. Of course, at the moment there are no hardware decoders available to the consumer that will decode Level 5 or Level 5.1. 4K UHD TVs presumably have a Level 5.2 H.264 hardware decoder built in if they support playback from media. If the spec for an expensive UHD TV only says Level 5.1 for H.264, we know it is limited to playing back only 30 fps at 4K, so it has built-in obsolescence.

There are other things called profiles, which typically cover things like colour sampling and further define maximum bitrates. Profiles can be thought of as a further refinement that allows hardware to target consumer, enthusiast or professional applications within that level.

Hope that helps explain it; there is more info on the Wiki: H.264/MPEG-4 AVC - Wikipedia, the free encyclopedia (http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC)

Regards

Phil
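A quick sanity check on the "pound for pound" arithmetic in the question above: comparing raw bits per pixel only, and deliberately ignoring the level/toolset differences Phil describes, which is exactly the part under debate.

```python
# Rough bits-per-pixel comparison; it ignores codec level/toolset efficiency,
# which is the open question in the thread.
def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(f"UHD 30p @ 60 Mbps: {bits_per_pixel(60, 3840, 2160, 30):.3f} bits/pixel")
print(f"HD  30p @ 24 Mbps: {bits_per_pixel(24, 1920, 1080, 30):.3f} bits/pixel")
print(f"HD  30p @ 15 Mbps: {bits_per_pixel(15, 1920, 1080, 30):.3f} bits/pixel")
# UHD 30p @ 60 Mbps: 0.241 bits/pixel
# HD  30p @ 24 Mbps: 0.386 bits/pixel
# HD  30p @ 15 Mbps: 0.241 bits/pixel  <- the "15 Mbps per HD quadrant" equivalence
```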
Ken Ross
February 11th, 2014, 02:23 PM
Phil, the new 2014 Sony UHD TVs clearly state they support 4K@60p. To me that's more reassuring than worrying about what 'level' they do or don't support. Clearly stating 4K@60p removes any and all doubt. :)

Phil Lee
February 11th, 2014, 03:58 PM
No, it creates doubt. What is the maximum bitrate? Can it decode 4:2:2? Does it support 10-bit? Only publishing the level and profile removes doubt. I'd be surprised if it isn't Level 5.2, however.

Piotr Wozniacki
February 11th, 2014, 04:02 PM
Phil, the new 2014 Sony UHD TVs clearly state they support 4K@60p. To me that's more reassuring than worrying about what 'level' they do or don't support. Clearly stating 4K@60p removes any and all doubt. :)

No way for me...

Phil Lee
February 11th, 2014, 04:08 PM
Hi

Just to add, I'm not sure if all, or indeed any, Sony 4K TVs support HDMI 2.0. If not, you will be limited to 4K at only 30 fps from any external sources, which is not exactly future-proofed unless they are firmware-updateable to HDMI 2.0. Buyer beware.

Edit: I just downloaded the i-manual for the XBR900 Sony 4K TV, and under the supported codec list it only goes up to Level 4.2. In other words it is saying there is no support for 4K media files at all, which seems odd.

Regards

Phil

Ken Ross
February 11th, 2014, 06:00 PM
The new Sonys clearly state they support 4K@60p. Just look at the website for the 2014 models. They all support HDMI 2.0. In fact, the older models got a firmware upgrade to 2.0.

Phil Lee
February 12th, 2014, 11:39 AM
Hi

The soon-to-be-released models from Sony still don't support the full potential of 4K: the specifications note that 4K at 60 fps is only supported at 4:2:0 and 8-bit, so there is no option to get more colours with 10-bit or 12-bit 4K, or better colour sampling like 4:2:2. It also looks like the internal media player is still HD-only, so no playback of 4K-encoded material via the network or USB. You'd struggle with any decent high-bitrate material anyway, as the Ethernet is only 100 Mbps. You'll have to wire your Sony AX100 up to the TV and hope you can get finished edits back onto an SD card in a format that will play from the camera.

These early 4K TVs are 4K panels with 2K electronics struggling to keep up. Sony isn't alone in having their 4K sets not feature-complete for 4K at this time. The manufacturers know that early adopters will pay over the odds and be prepared to replace their equipment once the finished products arrive; it is what pays for the R&D, I suppose, so we shouldn't complain.

Regards

Phil

Ken Ross
February 12th, 2014, 12:55 PM
Not quite correct Phil, the new Sonys will play internet 4K content via the onboard apps, such as the upcoming Netflix 4K streaming. As far as 4:2:2, have you seen these 4K displays? Color is absolutely gorgeous, regardless of how it's produced. Much nicer than 2K. Beyond that, what content have you seen at 4K/60p/4:2:2? There are lots of reasons to like the new 4K displays, but if you're determined not to buy, you can find reasons for that too. Some might not buy a GH4 because it can't do 4K@60p. Nothing is perfect Phil, nothing.
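For anyone wondering where the frame-rate ceilings quoted in the level discussion above come from, here is a back-of-the-envelope check using the MaxMBPS (macroblocks per second) limits from the H.264 level table. It only models the resolution/frame-rate ceiling, not bitrate or profile constraints, and the helper function is just for illustration.

```python
# Frame-rate ceilings implied by H.264 levels, from the spec's MaxMBPS limits.
MAX_MBPS = {         # 16x16 macroblocks per second allowed at each level
    "4.0": 245_760,
    "4.1": 245_760,
    "4.2": 522_240,
    "5.0": 589_824,
    "5.1": 983_040,
    "5.2": 2_073_600,
}

def max_fps(level, width, height):
    macroblocks = -(-width // 16) * -(-height // 16)   # ceiling-divide to whole macroblocks
    return MAX_MBPS[level] / macroblocks

print(f"1080p at Level 4.1: {max_fps('4.1', 1920, 1080):.1f} fps max")  # ~30 fps (Blu-ray)
print(f"1080p at Level 4.2: {max_fps('4.2', 1920, 1080):.1f} fps max")  # ~64 fps (AVCHD 60p)
print(f"UHD   at Level 5.1: {max_fps('5.1', 3840, 2160):.1f} fps max")  # ~30 fps (AX100 4K)
print(f"UHD   at Level 5.2: {max_fps('5.2', 3840, 2160):.1f} fps max")  # ~64 fps (4K 60p)
```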
Phil Lee
February 12th, 2014, 02:33 PM
Hi

The 4K content it will play is low-bitrate streamed content; it will hardly be 4K even if it is 'delivered' as 4K. Will it play any H.264 4K media file at 60 or 100 Mbps from a flash card or over the network? No, it looks like it won't. Will it play a 10 Mbps stream claiming to be 4K from Netflix? Yes, it will.

Resolution doesn't make a difference to the colour. If a TV is calibrated correctly and can display the required colour space, it will look all but identical to the next one, regardless of resolution. It's like my own computer monitors calibrated using a colorimeter: they look totally different at default, non-calibrated settings, and you could argue that out of the box one is better than the other, but once calibrated they look identical, which of course is the whole point. If you want to get into a conversation of reds looking redder, greens looking richer and yellows looking brighter, then here is probably not the best place to do that, as all we would be discussing is carefully crafted demos of TVs with "buy me" shop settings. What we know as fact is that 4K as it is currently being offered in these consumer cameras and TVs holds no more colour information than HD.

I've not said anything is perfect; I'm just being realistic. 4K TVs and camcorders/cameras are coming up short. The manufacturers are doing what they always do (they did it with HD as well), and the first products are not the final product. There is nothing wrong with waiting it out, and there is nothing wrong with taking the plunge straight away if you have the funds and want to. Let's just be honest and not be afraid to challenge, question and discuss the limitations of what is currently on offer. We all know that in 12 months' time all the 4K kit being sold now will look very dated and limited in its ability to showcase 4K at its best.

Regards

Phil

Ken Ross
February 12th, 2014, 04:22 PM
That's OK Phil, you're obviously not a fan of almost anything 4K as it exists today. I've seen your posts in other threads and that commonality runs through them. I happen to be a fan of the new crop of equipment, and I fully realize its limitations. Equipment will always have limitations. However, the picture I see on the current crop of 4K TVs (and the new ones coming, as seen at CES), with their limitations, greatly surpasses anything you or I now own. The video I've seen from the AX100 also greatly surpasses anything I've shot with in many ways, as will, I'm sure, the GH4. We could argue until the cows come home about what current 4K equipment has or doesn't have and we'll settle nothing. The bottom line is, if it looks much better to my eyes than anything 2K, that's all I'm concerned about. Remember Phil, there will always be something better; there will always be a next gen with additional features and capabilities. Two years from now, the models released then will make next year's models look dated. Life is short my friend; at some point we stop waiting. If you're waiting for perfection, it will never come, of that I'm sure. :)

Dave Blackhurst
February 13th, 2014, 01:36 AM
Specs and engineering standards tell part of the story, but we don't see or hear "specs", we see pictures or video, and we hear sound. In the end, does the final result look or sound noticeably better? Even if the "specs" say it "shouldn't", sometimes it does... there is more to life than numbers... conversely, sometimes the specs say it should be better, and it isn't...
Based on the RX100 and RX10, I'm thinking that the AX100 (and CX900 for that matter) will produce higher-bitrate 1080/60p that will be better than any other camera at that price point currently available... even if it's not "perfect". The 4K is a bonus, and even if I'm not a fan of 30p... it may prove usable, despite the "numbers"!

I'm sure the first couple of generations of 4K TVs will be "OK", but they will be surpassed a couple of years down the line. Tech gets better and/or cheaper; if you don't like where it's at, shut up and wait... but you'll probably be complaining when it's "better" and cheaper too. Me, I'll take it as it comes and shut up and shoot fun cameras like the RX100 and RX10, and whatever comes next... like the AX100.

Ken Ross
February 13th, 2014, 06:59 AM
Dave, I have preached the danger of 'specs-only buying' for years in both video and audio forums. It amazes me how many people still use specs as their sole criterion for judging, ruling out otherwise great equipment with what appear to be 'lesser' specs. How many speakers have frequency response curves that look essentially the same, yet sound totally different? I'm 100% convinced you could take some people who previously railed against the AX100 for lack of this or that, tell them in a blind test that the picture they were watching was from a camera that produced RAW video with 4:2:2 color, and I'd bet they'd gush over the picture despite the fact it was actually the AX100. Preconceived notions can distort both perception and reality.

Bill Koehler
February 13th, 2014, 01:24 PM
...Life is short my friend, at some point we stop waiting...

I will stop waiting right after NAB... these next couple of months are going to be long...

Cliff Totten
February 13th, 2014, 02:08 PM
I'm sure our grandchildren will be complaining about how horrible 8K is at 4:2:0 and 8-bit. I'm sure they will hate the quality of their 8K green-screen keys at 4:2:0 and how they need more chroma resolution to pull off what they want to do.

OK... 10-bit versus 8-bit 4K, yes, I get that argument. But 4K 4:2:0 has a TON of color information. I seriously would not be able to spot 4:2:0 video vs 4:2:2 at 4K resolution in an A/B test at all. It really looks the same to me. And keying 4:2:0 4K, I'm sure, will give more than great results. Let's not get lost entirely in "numbers" alone.

Pavel Sedlak
February 13th, 2014, 02:16 PM
Dave, I have preached the danger of 'specs-only buying' for years in both video and audio forums. It amazes me how many people still use specs as their sole criterion for judging, ruling out otherwise great equipment with what appear to be 'lesser' specs. How many speakers have frequency response curves that look essentially the same, yet sound totally different? I'm 100% convinced you could take some people who previously railed against the AX100 for lack of this or that, tell them in a blind test that the picture they were watching was from a camera that produced RAW video with 4:2:2 color, and I'd bet they'd gush over the picture despite the fact it was actually the AX100. Preconceived notions can distort both perception and reality.

The specification of a camera has its place if you are planning the type and amount of post-production work. I think that only people with long professional color grading experience can tell you how good the recording from a certain camera is during that grading process. HDV was really bad, the PDW-700 is nice, the PMW-350 has its limitations, the F3 (S-Log) is very good (even the XDCAM 35 Mbps 4:2:0 offline version can be OK), and so on.
It is not about one thing (lens quality, chip quality, ...); it is about all the small things beneath the specification, from the quality of the chip to the quality of the DSP processing. The specification can set some basic expectations, but only real work with the camera can tell you more, especially if you do heavy color grading on clips shot in extreme lighting. If you shoot in good weather (sunny days) and only need to put pictures on YouTube, then you can't see the differences between good and bad cameras (10-bit vs. 8-bit, or 4:2:0 vs. 4:2:2). Sometimes 4:2:0 color can even be better than 4:2:2 from the wrong camera .-), but there are other really important parameters. The result on the TV screen can be 4:2:0 and 8-bit, but a good camera lets you do much better work in post-production. Quality is not cheap, though, and your choice depends on your needs.

David Heath
February 13th, 2014, 06:52 PM
The soon-to-be-released models from Sony still don't support the full potential of 4K: the specifications note that 4K at 60 fps is only supported at 4:2:0 and 8-bit, so there is no option to get more colours with 10-bit or 12-bit 4K, or better colour sampling like 4:2:2.

I have to ask: what are the "more colours" that 4:2:2 and 10/12-bit sampling are going to give you in 4K that you don't get with 4:2:0 and 8-bit? Look at it this way: all current HD broadcast and Blu-ray is 8-bit, 4:2:0 encoding. Can anybody say they've ever watched a Blu-ray of a feature film at home and been aware of that? Now move up to 4K and think on this: for 4K 4:2:0, the chroma information alone is 1920x1080 - the same as the luminance in HD! And it's twice the number of chroma samples of 4:2:2 1080p! Given that the human eye has lower resolution for colour than for luminance, do you really think you'll notice the difference in any sort of normal viewing, even with a large screen?

As for "more colours", well, I suppose 10-bit or more does theoretically give a greater range of tones. But again, the human eye won't be able to perceive them. If you disagree, then try the test - Color Test - Online Color Challenge | X-Rite (http://www.xrite.com/custom_page.aspx?pageid=77&lang=en) - and ask yourself whether the "extra colours" that 10-bit gives you over 8-bit are worth it! :-) (Come on, what score did you get.........?)

Now, this is not to say that better colour sampling and bit depths higher than 8-bit don't have relevance - they do - but it's really only when you come to post manipulation that they have real significance, and in the case of bit depth, far more when used with something like S-Log or RAW. And they need a first-rate camera front end to bring much to the table anyway. Unfortunately, terms such as 4:2:2 and 10-bit have taken on a life of their own. In the world of SD, 4:2:2 certainly made a big difference, but it matters far less in HD, and less still in 4K - even less when you move away from true pro cameras.

Ken Ross
February 13th, 2014, 07:55 PM
Great assessment Dave! It's the old case of people always thinking 'more is better'. As you say, in certain cases yes, but in most cases it's not going to mean much anyway. But if you're looking for an excuse not to buy, this just makes some feel better. :)
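David Heath's chroma-sample point above is easy to verify with a little arithmetic; the subsampling divisors are the standard ones, and everything else below is just an illustration.

```python
# Chroma samples per channel per frame for common subsampling schemes.
def chroma_samples(width, height, subsampling):
    h_div, v_div = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}[subsampling]
    return (width // h_div) * (height // v_div)

print(f"1080p luma          : {1920 * 1080:,}")
print(f"1080p 4:2:2 chroma  : {chroma_samples(1920, 1080, '4:2:2'):,}")
print(f"UHD   4:2:0 chroma  : {chroma_samples(3840, 2160, '4:2:0'):,}")
# 1080p luma          : 2,073,600
# 1080p 4:2:2 chroma  : 1,036,800
# UHD   4:2:0 chroma  : 2,073,600   <- matches 1080p luma, twice 1080p 4:2:2 chroma
```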
Jack Zhang
February 13th, 2014, 07:58 PM
The difference that really matters is 10-bit, high dynamic range and a wider color gamut. Dolby Vision with localized high dynamic range is part of the equation that requires a higher bit depth, with Rec. 2020 gamut mapping also part of the equation. You can do all of this in 4:2:0 without too many problems, but we will have to move from 8-bit to 10-bit to truly realize these parts optimally.

I agree that on a small screen 4:2:0 shouldn't matter at all. Home theater can remain 4:2:0 for quite a while, since people are moving more towards convenience than quality. Netflix 4K is likely going to be a much more popular choice than the Sony server-based player. When you get into theatrical work, that's when 4:2:2 and even 4:4:4 matter, with digital intermediates for VFX and so on.

Tim Lewis
February 13th, 2014, 09:41 PM
Cat 5e or 6 with appropriate switching and NICs is 1000 Mbps.

Dave Blackhurst
February 14th, 2014, 04:03 AM
I took the color bar test and got a 28.... More cowbell...

Paul Rickford
February 14th, 2014, 04:31 AM
Great assessment Dave! It's the old case of people always thinking 'more is better'. As you say, in certain cases yes, but in most cases it's not going to mean much anyway. But if you're looking for an excuse not to buy, this just makes some feel better. :)

Yes, let's not forget the main clue to the specification is in the HANDYCAM badge on the side! Here, NOW, in 2014, we have a Handycam with full manual control, an improved HD codec, an OLED viewfinder etc., with entry into 4K thrown in almost as a bonus. After handling this at CES, I for one can't wait. I wonder, has anyone been given a firm shipping date yet?

Ken Ross
February 14th, 2014, 05:57 AM
Paul, Amazon is saying 3/19 if I recall correctly. Sony is usually pretty good at meeting product release dates.

Ken Ross
February 14th, 2014, 06:03 AM
I took the color bar test and got a 28.... More cowbell...

My feminine side won over, I got an 8. I know my color perception has always been pretty good; I remember doing that test a year ago and got a similar score. My wife got a 5, which surprised me. Hey, maybe I do need 4:2:2! Dave, you can make do with B&W. :)

Bruce Dempsey
February 14th, 2014, 08:24 AM
Your score: 11 (0 = perfect color acuity, 99 = low color acuity)
Color Test - Online Color Challenge | X-Rite (http://www.xrite.com/online-color-test-challenge)

Pavel Sedlak
February 14th, 2014, 11:17 AM
...But 4K 4:2:0 has a TON of color information. I seriously would not be able to spot 4:2:0 video vs 4:2:2 at 4K resolution in an A/B test at all. It really looks the same to me. And keying 4:2:0 4K, I'm sure, will give more than great results. Let's not get lost entirely in "numbers" alone.

You are right that 4K has more color samples than HD, so 4K starts from a better position for interpolating color from 4:2:0 than HD does. But whether your eyes can see 4:2:2 (or 4:4:4) subsampling is the wrong question here. The 4:2:2 (or 4:4:4) color subsampling is only one small part of a camera's quality: you need a quality lens, chip and A/D converter, and all the DSP circuits (including the color rendering), the audio section, the encoder and the recording compression must be of the best quality. The specification is only a basic view; I know of a lot of differences between cameras with similar specifications. 10-bit sampling and 4:2:2 chroma subsampling are important for heavy grading in post-production; they are input for the computer, not for your eyes .-). It also has nothing to do with the quality of your eyes; with a poor signal and heavy color grading you will quickly see the differences. I think 4K 4:2:0 will be much more common in the lower class of professional camcorders than it was in HD. Our brains will be surprised by the unusual sharpness at the beginning of the 4K era, but will quickly get used to it.
After that, we will see the same differences as before: a natural image with really nice colors, or an unnatural image with electronic sharpness and poor colors. The natural look is the most important thing for viewers, more so than sharpness .-) (the high contrast in details).

Dave Blackhurst
February 14th, 2014, 04:42 PM
I'm going with "it's time to replace/upgrade monitors and video card"... yeah, that's it, not my old eyes!

Ken Ross
February 14th, 2014, 07:41 PM
I'll support you on that Dave. ;)

Dave Blackhurst
February 18th, 2014, 05:12 AM
Redid the test, got an 11... on a non-color-corrected LCD no less... though definitely newer than my "main" color-corrected screens... hmmm.... Still say "more cowbell"....

And I still say that the new cameras will likely have "eye-popping" image quality, and good-looking color, sharpness, etc... however "deficient" the specs may be! I have yet to feel "disappointed" by anything I've shot with the RX10... it keeps calling to me to shoot more...

Ken Ross
February 18th, 2014, 06:46 AM
Dave, I think the 11 score has moved you out of the 'color blind' category and into our elite club, whose members can clearly differentiate red from green from blue. :) Seriously, it's so easy to miss a few subtle shadings and color transitions on that test... and then there's the issue of the monitor you use during the test.

I agree though, I firmly believe the AX100 will surprise many, just as the RX10 did. The RX10 has gained the respect of almost everyone out there. With what is essentially the same sensor and processing, but with 4K, many additional features and an ergonomic form more befitting a video shooter, I'll be surprised if it isn't a hit.

David Heath
February 19th, 2014, 02:56 PM
Seriously, it's so easy to miss a few subtle shadings and color transitions on that test... and then there's the issue of the monitor you use during the test.

And all joking apart, that's why I linked to the test. If you import a screenshot into Photoshop, you can use the picker to see the R, G, B values of the individual blocks - typically, they are a couple of values (in the 8-bit world) apart. As Ken says, "it's so easy to miss a few subtle shadings" - and that's with those differences. (They typically represent 7-bit resolution.) And it's without noise, especially moving noise, which typically masks gradation differences. Do the test with some noise added to the blocks and expect the score to go far higher....! :-)

So who still thinks 10-bit will make much difference over 8-bit in terms of seeing extra colour gradation on normal images? :-) Yes, it makes a difference in the post process - but only if the source material is of extremely good quality in the first place. That's why unduly worrying about 10-bit on this class of camera is pretty pointless.

Ken Ross
February 19th, 2014, 03:20 PM
Great points Dave, thanks.
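David Heath's point about adjacent code values and noise can be illustrated with a small, purely hypothetical experiment: the step between neighbouring 8-bit levels is about 0.4% of full scale, and roughly one code value of simulated noise is enough to smear a "flat" grey across several 8-bit codes, masking the finer gradation that 10-bit would otherwise offer on a plain image.

```python
# Illustrative only: quantisation step sizes and the effect of mild noise.
import random

step_8bit = 1 / 255
step_10bit = 1 / 1023
print(f"8-bit step : {step_8bit:.4%} of full scale")    # ~0.3922%
print(f"10-bit step: {step_10bit:.4%} of full scale")   # ~0.0978%

# Quantise a mid-grey with about one 8-bit code value of Gaussian noise added.
grey = 0.5
codes = {round((grey + random.gauss(0, step_8bit)) * 255) for _ in range(1000)}
print(f"8-bit codes hit by a nominally flat grey: {sorted(codes)}")
```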
Cliff Totten
February 19th, 2014, 05:13 PM
This is strictly a rumor; however, I read a post on a Sony board about the AX100's HDMI output. The poster claimed that a Sony rep told him the HDMI port is NOT a clean output. Again, there is no way to know yet if this is B.S.; it might be completely false. If it is true, then this "could" be the smoking-gun crippling trick that we have been waiting for with this camera. "If" this is true, then there will be no way to ever get more than 60 Mbps long-GOP out of this camera... ever. Future 4K external recorders will be useless on it. Again, no way to know until it hits the streets. Let's hope this is a lie, or that Sony changes its mind quickly on this.

Ken Ross
February 19th, 2014, 06:31 PM
I'm really not sure how many people would be using an external recorder with this camera. As for 4K external recording, there is exactly one choice. So I guess I'm dubious as to how many potential buyers would be deterred if this turns out to be true.

David Heath
February 19th, 2014, 06:50 PM
I'm really not sure how many people would be using an external recorder with this camera.

I agree, and if I was on the design team for the camera, I'd be hitting my head against the wall. I'd feel like I'd designed a superb small, budget car which everybody agreed was far better than anything in its class or price range had ever been before, and then I started reading comments along the lines of "it doesn't have the performance of a sports car though?" and "you can't even put a sports car engine in it!?"

All design - be it cars, cameras, washing machines or whatever - has to be a series of compromises. Do you (in the case of a car) design for off-road ability? Capital cost? Fuel economy? Comfort or performance? Two-seater and sporty, or able to take the family - or a big load carrier? There's no right or wrong from a designer's point of view, but different products will appeal to different buyers.

FWIW, I think this one seems well designed *for the market it's aimed at*. Worrying about the lack of features you may expect on a far more pro camera (10-bit, 4:2:2 etc.) seems a bit like a farmer buying a two-seater sports car and complaining about its lack of load-carrying ability and off-road performance... :-)

Wacharapong Chiowanich
February 19th, 2014, 08:05 PM
And on the contrary, if this turns out not to be a lie, and we assume the output to be 10-bit 4:2:2 of some sort, then you could almost write an obituary for the AX100 and PZ100. Sony's got to be thinking hard about this before that March date arrives.