
Re: [pygame] Pygame and OpenGL



On Mon, Nov 05, 2001 at 11:26:36AM +0200, Jan Ekholm wrote:
> On Mon, 5 Nov 2001 crispin@iinet.net.au wrote:
> 
> >On Mon, Nov 05, 2001 at 09:56:47AM +0200, Jan Ekholm wrote:
> >You can also do texture weighting and other cool gfx memory management
> >in OpenGL 1.1
> >
> >I have used the pygame routines to render things to a SDL surface, then
> >used the Image's tostring() to turn it to a raw string to be imported
> >into a GL texture.
> 
> Hmm, how well would this work for data that changes frequently, say,
> once every second? I assume that the best way is still to get a panel
> and other status data done as normal textures blitted in glOrtho, as you
> said. It's a little bit more work to maintain those textures than
> normal 2D pygame graphics would be.

The real speed with textures in OpenGL comes from them being resident in the GFX card's memory, which allows the hardware's transformation and rendering engines to work directly with them. If you drew into a software surface, the two slow bottlenecks would be tostring()ing it and the OpenGL driver transferring it to gfx RAM across the bus.

> Hmm, does anyone have a really simple pygame-based application that just
> creates a OpenGL surface and draws something really simple on it? I'm not
> sure the pygame contexts that come with pyopengl are what I want. I have
> the impression I can simply do something like:
> 
> # create an OpenGL surface
> surface = pygame.display.set_mode ((width, height), OPENGL | DOUBLEBUF)
> 
> # render stuff onto it
> glBegin ( GL_TRIANGLES )
> glVertex (...)
> glVertex (...)
> glVertex (...)
> glEnd ()
> 
> # update display
> pygame.display.flip()

I don't know if this will work. I did the following sort of thing...

self.image=self.font.render(self.text, 1, self.colour)

...

#make the string. This is a slow bit
tex=pygame.image.tostring(self.image, "RGBA")

#assign a texture
self.texture=glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, self.texture)
glPixelStorei(GL_UNPACK_ALIGNMENT,1)

#build MIPMAP levels. This is another slow bit
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, self.width, self.height, GL_RGBA, GL_UNSIGNED_BYTE, tex)
                
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)  
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)        


I presume (because I really don't know for sure) that the ideal way to do it is to create a hardware surface, render to it using SDL, and then 'just use' that surface as a texture in OpenGL (because it's already in the GFX hardware). Do SDL and OpenGL play nicely together in hardware? If the SDL primitives are reduced to OpenGL primitives (inside SDL) then yes.

I don't know how the internals work. If this is possible, this is by far the best way of combining the two.

Anyone know for sure?

Crispin

____________________________________
pygame mailing list
pygame-users@seul.org
http://pygame.seul.org