
Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?



Christopher Night wrote:

Not that it's the most important thing in the world, but I think that OpenGL agrees with me. Here's a program that creates a scaled surface extending from (0,0) to (1000, 1000), with a canvas size of 100x100 pixels (so the scale factor is 0.1). It then draws the following horizontal lines:

Red line at y = -4 (will appear if we're rounding but not if we're truncating)
Blue line at y = 6 (will appear)
White line at y = 500 (will appear)
Red line at y = 996 (will appear if we're truncating but not if we're rounding)
Blue line at y = 1004 (will not appear)

When I run the script, I see a blue line at top and a red line at bottom, which is the correct behavior if we're truncating. But feel free to tell me if there's something wrong with this experiment.
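
The script itself isn't reproduced above, but a minimal sketch of that kind of setup, assuming a bare glOrtho projection with pygame + PyOpenGL rather than whatever scaled-surface wrapper the original actually used, would look roughly like this:

    import pygame
    from pygame.locals import DOUBLEBUF, OPENGL, QUIT
    from OpenGL.GL import (
        GL_COLOR_BUFFER_BIT, GL_LINES, GL_MODELVIEW, GL_PROJECTION,
        glBegin, glClear, glClearColor, glColor3f, glEnd,
        glLoadIdentity, glMatrixMode, glOrtho, glVertex2f)

    pygame.init()
    pygame.display.set_mode((100, 100), DOUBLEBUF | OPENGL)

    # Map coordinates (0,0)-(1000,1000) onto the 100x100 window,
    # i.e. a scale factor of 0.1.
    glMatrixMode(GL_PROJECTION)
    glLoadIdentity()
    glOrtho(0, 1000, 0, 1000, -1, 1)
    glMatrixMode(GL_MODELVIEW)
    glLoadIdentity()

    def hline(y, colour):
        # One-pixel-wide horizontal line across the full width at the given y.
        glColor3f(*colour)
        glBegin(GL_LINES)
        glVertex2f(0, y)
        glVertex2f(1000, y)
        glEnd()

    glClearColor(0, 0, 0, 1)
    glClear(GL_COLOR_BUFFER_BIT)
    hline(-4, (1, 0, 0))     # red
    hline(6, (0, 0, 1))      # blue
    hline(500, (1, 1, 1))    # white
    hline(996, (1, 0, 0))    # red
    hline(1004, (0, 0, 1))   # blue
    pygame.display.flip()

    # Keep the window open until it is closed.
    while not any(e.type == QUIT for e in pygame.event.get()):
        pygame.time.wait(50)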

I think what's happening here is that OpenGL is treating the
coordinates as pixel boundaries, and treating the line as extending
half a pixel either side of the zero-width line between its
endpoints.

In post-transform coordinates, y = 0.0 is a borderline case --
exactly half of each pixel is in the window. For y = -0.4, less than
half is in the window, so no pixels get drawn. Similarly for
y = 100.4. For y = 99.6, most of each pixel is in the window, so
the line appears.
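
To put rough numbers on that (an illustration of my own, not code from Christopher's script): with a scale factor of 0.1, a one-pixel-wide line centred on post-transform y covers y - 0.5 to y + 0.5, and what matters is how much of that footprint lands inside the 100-pixel-high window.

    SCALE = 0.1
    WINDOW = 100   # window height in pixels

    for user_y in (-4, 6, 500, 996, 1004):
        centre = user_y * SCALE              # post-transform coordinate
        lo, hi = centre - 0.5, centre + 0.5  # half a pixel either side
        inside = max(0.0, min(hi, WINDOW) - max(lo, 0.0))
        print("y = %5d: footprint %6.1f .. %6.1f, %.1f of a pixel inside"
              % (user_y, lo, hi, inside))

Only y = -4 and y = 1004 end up with less than half a pixel inside the window, which matches the two lines that don't show up.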

So the same principle applies here -- work out where the boundaries
of the filled area are, and round them to the nearest pixel
boundaries.

And again, the reason it seems like it's truncating is that it's
applying a half-pixel offset from the coordinates you provide to
find the edges of the line, and then rounding those. (At least
that's what's happening conceptually -- the actual arithmetic
it's performing might be different.)
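
Here's that rule spelled out as a small sketch (my own paraphrase, assuming round-half-up as the tie-break -- the driver's actual tie-breaking only matters for edges that land dead on a pixel boundary):

    import math

    SCALE = 0.1
    WINDOW = 100   # window height in pixels

    def round_half_up(x):
        return math.floor(x + 0.5)

    def filled_rows(user_y):
        # Edges of the filled area: half a pixel either side of the supplied
        # coordinate, rounded to the nearest pixel boundary.
        centre = user_y * SCALE
        top = round_half_up(centre - 0.5)
        bottom = round_half_up(centre + 0.5)
        return [row for row in range(top, bottom) if 0 <= row < WINDOW]

    for user_y in (-4, 6, 500, 996, 1004):
        print("y = %5d -> pixel rows %s"
              % (user_y, filled_rows(user_y) or "none"))

That fills rows for y = 6, 500 and 996 and nothing for y = -4 or 1004, which is exactly what Christopher observed, without truncating anything.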

--
Greg