
Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?



On Sun, 02 Oct 2011 13:53:02 +1300
Greg Ewing <greg.ewing@xxxxxxxxxxxxxxxx> wrote:

> Now, suppose our arithmetic is a little inaccurate, and we
> actually get x = 0.499. The boundaries then come out as -0.001
> and 0.999. If we round these, we get 0.0 and 1.0 as before.
> But if we floor, we get -1.0 and 0.0, and the pixel goes
> off the screen.
> ...
> This may be the reason we're misunderstanding each other --
> you're thinking of (0, 0) as being the *centre* of the top
> left pixel in the window, whereas I'm thinking of it as the
> *top left corner* of that pixel.

Although I was not the recipient of the original answer: very
interesting! That's actually the first time I've understood why using
the upper-left coordinate system may make sense under certain
conditions. :)
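
To make sure I followed the arithmetic, here is a tiny sketch (plain
Python, nothing pygame-specific) reproducing the two outcomes for a
pixel centred at x = 0.499:

import math

x = 0.499                        # centre nudged off 0.5 by arithmetic error
left, right = x - 0.5, x + 0.5   # pixel boundaries: -0.001 and 0.999

print(round(left), round(right))            # 0 1  -> pixel stays in column 0
print(math.floor(left), math.floor(right))  # -1 0 -> pixel slides off the left edge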

Yet, while I followed the math through, I am unsure how bad those small
inaccuracies would really turn out to be. They would essentially be the
result of scaling a very large physical model down to screen
resolution, so I truly wouldn't care if my sprite appeared 1px to the
left of where I expected it, as long as the model itself stays
accurate. For game interface elements (where visual alignment matters
more) I probably wouldn't use scaled surfaces anyway.
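
Just to put a number on the kind of error I mean, a rough sketch (the
world size, screen width and world_to_screen helper below are all made
up for illustration):

# Hypothetical figures: a large physical model mapped onto a small screen.
WORLD_WIDTH = 100_000.0   # metres in the model
SCREEN_WIDTH = 800        # pixels

def world_to_screen(x_world):
    # One pixel covers 125 m of world space at this scale.
    return x_world * SCREEN_WIDTH / WORLD_WIDTH

# A 3 cm error in the model moves the sprite by ~0.00024 px -- invisible,
# as long as the model itself keeps computing in world units.
print(world_to_screen(50_000.03) - world_to_screen(50_000.0))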

But again... interesting to debate. Could be another parameter for the
surface (coord_system=PIXEL_CENTRE | PIXEL_BOUNDARIES)!! :P
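
If someone really wanted to play with that, a user-side wrapper could
look roughly like this (entirely hypothetical names, not an actual
pygame API):

import math

PIXEL_CENTRE, PIXEL_BOUNDARIES = range(2)   # hypothetical constants

class ScaledSurface:
    # Thin wrapper: converts model coordinates to pixel coordinates
    # before drawing, using the chosen convention.
    def __init__(self, surface, scale, coord_system=PIXEL_CENTRE):
        self.surface = surface
        self.scale = scale
        self.coord_system = coord_system

    def to_pixel(self, x, y):
        px, py = x * self.scale, y * self.scale
        if self.coord_system == PIXEL_CENTRE:
            # (0, 0) is the centre of the top-left pixel: round to nearest
            return round(px), round(py)
        # (0, 0) is the top-left corner of that pixel: floor
        return math.floor(px), math.floor(py)

    def set_at(self, pos, colour):
        self.surface.set_at(self.to_pixel(*pos), colour)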

/mac