July 19th, 2007, 05:09 PM | #1
Major Player
Join Date: Nov 2006
Location: Victoria BC
Posts: 400
Samsung LCD 8-bit or 10-bit video processor
Samsung makes two levels of a similar 40" 1080p LCD - one with a 10,000:1 contrast ratio and an 8-bit video processor, and a higher-end version with 15,000:1 contrast and a 10-bit video processor. Both are HDMI 1.3 spec.
Here's the higher-end 40 - the LN-T4065F: http://www.samsung.com/ca/products/t.../ln-t4065f.pdf It doesn't explain how important Samsung's 8-bit vs. 10-bit video processors are. It's a $400 difference for me, so I'd like to be pretty sure. If I use the 10-bit monitor for colour correction, editing, etc., will it truly be a step up from the 8-bit and reduce banding as I hope it will?
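For what it's worth, the banding question mostly comes down to how many discrete steps each channel gets: 2^8 = 256 levels at 8-bit versus 2^10 = 1024 at 10-bit. Here's a minimal Python sketch - not tied to any particular Samsung hardware, just the arithmetic - of why a smooth gradient shows coarser steps at the lower bit depth:

```python
# Quantize a smooth 0..1 luminance ramp at 8 and 10 bits and count how many
# distinct steps survive. Fewer steps across the same range means coarser,
# more visible banding on gradients.

def quantize(value, bits):
    """Map a 0.0-1.0 value onto an integer code of the given bit depth."""
    levels = (1 << bits) - 1              # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

ramp = [i / 1999 for i in range(2000)]    # a finely sampled smooth gradient

codes_8 = {quantize(v, 8) for v in ramp}
codes_10 = {quantize(v, 10) for v in ramp}

print(f"8-bit:  {len(codes_8)} distinct steps")    # 256
print(f"10-bit: {len(codes_10)} distinct steps")   # 1024
```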
__________________
Mac + Canon HV20
July 19th, 2007, 10:11 PM | #2
Inner Circle
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
Considering it's a consumer TV, it may not be good for color correction if it does wacky things to your picture. For example, what exactly does its "Wide Color Enhancer" do? Any so-called enhancement is likely a bad thing.
July 19th, 2007, 10:40 PM | #3
Inner Circle
Hi Robert........
I may well be wide of the mark on this, and I'm sure if I am someone will let us know pdq, but as far as I know, yes - the 10-bit video processor (if it's the bit that processes the colour) will make a significant difference, BUT only on material that contains 10-bit colour.
The rub there is, I am not aware of anything but very high-end pro gear that handles anything other than 8-bit at the current time. So, will you see a difference doing your editing? My guess is, at this point in time - no. As for colour correction etc., I guess it can't be any worse than your average HD telly is - tho' that entire subject is rather amusing to me 'cos I do my colour correcting on an HD telly, and am as colour blind as they come, and no one's complained yet! (Hey, I figure that if it looks OK on my HD set it's gotta look reasonable on everybody else's!) CS
July 19th, 2007, 10:53 PM | #4
Inner Circle
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
1- HDMI 1.3 supports 10-bit color over the HDMI interface. 10-bit could also refer to the bit depth of the LUTs in the monitor, or to the D->A driving each pixel element.

2- The important thing is that you need a monitor that doesn't hide problems (e.g. not enough resolution) and that you aren't correcting for problems in the monitor (e.g. modes that pump saturation to make the image look 'better').
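To illustrate the LUT point - the actual internals of these Samsung sets aren't documented, so this is just the general idea in Python: if the set applies its gamma/colour adjustments through an 8-bit table, some of the 256 input codes collapse onto the same output and you get banding after the adjustment; with a 10-bit internal table the same 8-bit input survives untouched.

```python
# Apply the same mild gamma tweak through an 8-bit table and a 10-bit table,
# then count how many of the 256 possible 8-bit input codes still map to
# unique outputs. Codes lost in the 8-bit path show up as banding.

def gamma_lut(bits, gamma=1.2):
    levels = (1 << bits) - 1
    return [round(((i / levels) ** gamma) * levels) for i in range(levels + 1)]

lut_8 = gamma_lut(8)        # 256-entry table with 8-bit outputs
lut_10 = gamma_lut(10)      # 1024-entry table with 10-bit outputs

# Feed the same 8-bit source through each processing path.
out_8 = {lut_8[i] for i in range(256)}
out_10 = {lut_10[i * 4] for i in range(256)}   # rough scale of an 8-bit code to a 10-bit index

print(f"unique outputs through  8-bit LUT: {len(out_8)} of 256")   # typically fewer than 256
print(f"unique outputs through 10-bit LUT: {len(out_10)} of 256")  # all 256 preserved
```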
July 19th, 2007, 11:15 PM | #5
Inner Circle
Hey Glenn...........
Do you know of anything actually putting out 10-bit colour at the moment? By which I mean DVD, Blu-ray, HD DVD, HD transmissions, graphics cards, etc.
BTW, I'm not saying I know there isn't any, I'm asking if there is, 'cos I haven't heard of any is all. CS
July 20th, 2007, 12:20 AM | #6
Major Player
Join Date: Nov 2006
Location: Victoria BC
Posts: 400
Thanks for the replies.

I've been doing some reading - it's an HDMI 1.3 TV, so in theory the bandwidth of the connection far, far surpasses what any HD LCD could display regardless. I'd do my best to make the monitor 'flat' and correct - I'd want to see into the blacks, so the 15,000:1 ratio would be for, perhaps, occasionally viewing what the average Joe would see... consumer, yes - affordable, also yes.

I'd like to skip an 8-bit-only display - I *fully* intend to have this directly connected to the 14-bit out (!!!) on my Decklink HD Extreme card, and yes, I will be working in Adobe After Effects on, say, 10-bit ProRes 422 imagery (I do graphic design and motion graphics in addition to video production).

So, safe to say 10-bit will be better than 8-bit, yes?
__________________
Mac + Canon HV20
July 20th, 2007, 01:26 PM | #7
Inner Circle
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
1- SDI is 10-bit. This is helpful when converting between RGB and Y'CbCr, since those conversions pick up rounding error; with more bits, the rounding error is smaller. For acquisition, 10-bit is helpful for images with a large dynamic range that you want to manipulate later.

2- For home video, there are plans to eventually introduce wide gamut color. Wide gamut colors would be colors that lie outside the Rec. 709 gamut... with wide gamut, you can have extremely saturated/pure colors. Things like lasers and diffraction (e.g. the rainbow reflection off a CD/DVD) can produce these extremely saturated colors. To do wide gamut color, you want 10-bit to represent the much greater range of colors. But when wide gamut takes off (if it even does), monitors will likely be much better then and you'd likely be looking at replacing yours.

3- The 15,000:1 ratio is likely a lie / an extremely fudged spec. It is probably derived by comparing the maximum luminance to the minimum luminance while counting the backlight: for maximum luminance the backlight is turned all the way up, and for minimum luminance it is turned all the way down. In practical situations you would never vary the backlight like that (unless the panel is an IMLED display; I don't believe IMLED is viable for broadcast monitors right now / no one is doing it yet). Anyways, it's a pretty meaningless spec.
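On the rounding-error point, here's a rough Python sketch. It uses full-range values and the Rec. 709 coefficients for simplicity (real SDI uses limited-range coding with head/footroom, so the exact numbers differ), but it shows the principle: round-trip a grid of RGB colours through Y'CbCr quantized at 8 and at 10 bits, and the worst-case error shrinks with the extra bits.

```python
# Round-trip RGB -> Y'CbCr -> RGB with the intermediate Y'CbCr values
# quantized to a given bit depth, and report the worst error in 8-bit steps.

KR, KB = 0.2126, 0.0722            # Rec. 709 luma coefficients
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

def worst_roundtrip_error(bits):
    levels = (1 << bits) - 1
    worst = 0.0
    for r8 in range(0, 256, 17):               # a coarse grid of source colours
        for g8 in range(0, 256, 17):
            for b8 in range(0, 256, 17):
                r, g, b = r8 / 255, g8 / 255, b8 / 255
                y, cb, cr = rgb_to_ycbcr(r, g, b)
                # Quantize the intermediate Y'CbCr representation.
                y = round(y * levels) / levels
                cb = round((cb + 0.5) * levels) / levels - 0.5
                cr = round((cr + 0.5) * levels) / levels - 0.5
                r2, g2, b2 = ycbcr_to_rgb(y, cb, cr)
                worst = max(worst, abs(r2 - r) * 255,
                            abs(g2 - g) * 255, abs(b2 - b) * 255)
    return worst

print(f"worst round-trip error via  8-bit Y'CbCr: {worst_roundtrip_error(8):.2f} /255 steps")
print(f"worst round-trip error via 10-bit Y'CbCr: {worst_roundtrip_error(10):.2f} /255 steps")
```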
July 20th, 2007, 03:09 PM | #8
Major Player
Join Date: Nov 2006
Location: Victoria BC
Posts: 400
Thanks Glenn, et al.

I own dual 19" NEC LCDs - very sharp, great colour, no banding, no dithering. I use them for pixel-based graphic editing, compositing and graphic design. I've had them for almost two years now... and just found out today they're 6-bit, rather than 8-bit. Wow, I had no idea. They're really so good at hiding it - I can spot most artifacts and noise in images faster than most people, but never saw 'problems' in my monitors till I took a magnifying glass to the LCD today! Lol. It's nothing that's affected my work - maybe if I was doing 3D ray tracing I'd need a better set of monitors.

So suffice it to say, if my monitors are very good for my purpose at 6-bit (and they can't really show a full 8 bits regardless), there'd be no advantage in going for a monitor that inflates itself as being 10-bit - it's marketing. 8-bit would be fine for the time being. I'll go for the 8-bit model at a larger size, or wait.

Thanks guys - you've all been very helpful.
__________________
Mac + Canon HV20