
Re: [pygame] Feature discussion, proposals



On 19 March 2015 at 15:28, Mikhail V <mikhailwas@xxxxxxxxx> wrote:
>>> - how is an indexed image generally rendered? Does it send the
>>> index array to the video card, which then performs the substitution?
>>
>> Personally I would forget about optimisation and
>> only use indexed surfaces if it simplified my program
>> logic. For performance, I trust convert() and
>> convert_alpha() to know more about the system it's
>> running on than I do.
>>
>> --
>> Greg
>
>
> Okay, here is a simple example which shows some odd behaviour in pygame.
> I have initialised a display and a surface like this:
>
> DISP = pygame.display.set_mode((w, h), 0, 8)
> window = pygame.Surface((w, h), 0, 8)
>
> Now I set a palette for DISP, then write data from a uint8 array directly to DISP:
>
> bv = DISP.get_view("0")
> bv.write(myarray.tostring())
>
> Everything is shown correctly. But if I put my array in 'window' first and
> then blit it to DISP, I get garbage.
> How can that be?
> I see the correct values only if I explicitly set the same palette on
> 'window'. That is contrary to my expectations.
> I could understand it if my DISP were an RGB 24- or 32-bit surface; then
> blitting would always have to resolve colors. But how does my data get
> corrupted when I blit between an 8-bit surface and an 8-bit display? Both
> are just arrays of indexes.
> What am I missing or doing wrong? If this is normal behavior, it still
> seems wrong, since it performs some parasitic operation on my array which
> also results in a wrong picture in the end.
>
> Regards,
> Mikhail

Just an update to my observations:
With convert() applied to my 'window' surface it also works correctly.
Still, I am a bit confused why the source surface needs to be converted
when it was created with the same bit depth as the display.
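For what it's worth, here is a rough sketch (plain Python, no pygame) of what I believe an 8-bit to 8-bit blit does under the hood: SDL resolves each source index to a color via the source surface's palette, then remaps it to the nearest entry in the destination palette. The function names are my own, just to model the behaviour; if the two palettes differ, the raw index values change, which would explain the "garbage". When the palettes match, indices pass through unchanged.

```python
def nearest_index(color, palette):
    """Index of the palette entry closest to color (squared RGB distance)."""
    return min(range(len(palette)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(color, palette[i])))

def blit_8bit(src_pixels, src_palette, dst_palette):
    """Model of an 8-bit -> 8-bit blit: each source index is resolved to a
    color through the source palette, then remapped to the nearest entry
    in the destination palette."""
    remap = [nearest_index(c, dst_palette) for c in src_palette]
    return [remap[p] for p in src_pixels]

gray = [(i, i, i) for i in range(256)]
pixels = [0, 10, 200, 255]

# Identical palettes: indices survive the blit untouched.
assert blit_8bit(pixels, gray, gray) == pixels

# Different palettes: the indices get rewritten to preserve the colors.
print(blit_8bit(pixels, gray, list(reversed(gray))))  # -> [255, 245, 55, 0]
```

This would also explain why Surface.convert() fixes it: converting against the display gives 'window' the display's palette, so the remap table becomes the identity.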