
Re: [pygame] 10 bits per color



On Fri, Oct 2, 2009 at 12:20 PM, Greg Ewing <greg.ewing@xxxxxxxxxxxxxxxx> wrote:
> James Paige wrote:
>
>> So by that measure, average human eyes should not be able to tell the
>> difference between RGB888 and RGB101010
>
> There could conceivably be an advantage in terms of
> dynamic range to using more bits, if the display
> device is capable of it. You could get the same
> effect using some kind of logarithmic encoding
> instead of a linear one, but that would complicate
> image processing.

What linear encoding? sRGB (i.e. the RGB888 encoding used on 99% of
standard displays) is encoded nonlinearly, with a gamma of roughly 2.2.
Though I suppose you could encode RGB888 with an even higher gamma in
order to stretch the dynamic range further, at the cost of harsh
quantization.
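
Something like this back-of-the-envelope sketch shows where the extra
bits go (just a sketch: a plain 2.2 power law standing in for the real
sRGB piecewise curve, and the bit depth is the only knob that matters):

    # Sketch: quantization step of a gamma-encoded channel at 8 vs 10 bits.
    # A plain power-law gamma of 2.2 stands in for the real sRGB curve.

    def encode(linear, gamma=2.2, bits=8):
        """Encode a linear light value (0.0-1.0) to an integer code."""
        levels = (1 << bits) - 1
        return round((linear ** (1.0 / gamma)) * levels)

    def decode(code, gamma=2.2, bits=8):
        """Decode an integer code back to a linear light value."""
        levels = (1 << bits) - 1
        return (code / levels) ** gamma

    # Worst-case step sizes near black and near white:
    for bits in (8, 10):
        dark_step = decode(1, bits=bits) - decode(0, bits=bits)
        bright_step = decode((1 << bits) - 1, bits=bits) - decode((1 << bits) - 2, bits=bits)
        print(bits, "bits: darkest step %.2e, brightest step %.2e" % (dark_step, bright_step))

Cranking the gamma up makes the dark steps finer still, but the bright
steps get correspondingly coarser, which is the harsh quantization I
mean.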

While I'm being silly, I could also say that a super-high-gamma
encoding might actually work quite well if you combined it with
something like Amiga HAM (making the actual meaning of a given pixel
relative to the decoded light level of the pixel to its left, and hence
compensating for the reduced precision).
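
A toy version of that idea, just to show the shape of it (everything
here is made up and looks nothing like the real HAM bit layout):

    # Toy sketch of a HAM-ish relative encoding for one channel of a
    # scanline: each pixel stores either an absolute level or a small
    # delta from the previous pixel's decoded level.

    def encode_scanline(levels, delta_bits=4):
        max_delta = (1 << (delta_bits - 1)) - 1
        prev = 0
        out = []
        for level in levels:
            delta = level - prev
            if -max_delta <= delta <= max_delta:
                out.append(('delta', delta))      # cheap, relative pixel
            else:
                out.append(('absolute', level))   # expensive, absolute pixel
            prev = level
        return out

    def decode_scanline(codes):
        prev = 0
        out = []
        for kind, value in codes:
            prev = value if kind == 'absolute' else prev + value
            out.append(prev)
        return out

    line = [0, 2, 5, 9, 200, 203, 201, 198]
    assert decode_scanline(encode_scanline(line)) == line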

I think it's also true that there are some RGB101010 displays that can
actually display color values outside of standard sRGB (after all,
standard sRGB only covers 55% of the Munsell Color System samples,
notably omitting some extremely saturated secondary colors, e.g. orange
and purple). Such displays definitely have more dynamic range
available, unlike common LCDs with high contrast ratios (which distort
the sRGB colorspace to seem more vivid instead of actually encoding and
displaying in a fuller format). In this situation you have to be
careful not to fall into the trap of thinking the minima and maxima of
sRGB (RGB888) and RGB101010 can be treated as equivalent.
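
To make that concrete: widening 8-bit codes to 10-bit codes is only a
rescale if both ends really do use the same primaries and curve.
Something like this (a sketch, not a real conversion routine):

    # Sketch: widen an 8-bit sRGB code to 10 bits, *assuming* the
    # 10-bit target uses the same sRGB primaries and transfer curve.
    # If the target is a wider-gamut RGB101010 mode, this is exactly
    # the trap: 255 and 1023 would mean different colors, and you would
    # instead need to linearize, convert primaries with a 3x3 matrix,
    # and re-encode.

    def widen_8_to_10(code8):
        # Rescale so 0 -> 0 and 255 -> 1023 (not a plain "<< 2", which
        # tops out at 1020 and slightly darkens whites).
        return (code8 * 1023 + 127) // 255

    print(widen_8_to_10(0), widen_8_to_10(128), widen_8_to_10(255))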