
Scene antialiasing



Hey people,



	I'm having a strange problem doing scene antialiasing in OpenGL
(under SDL). The thing is, my code works just fine without the
antialiasing code. Then I add the code to use the accumulation buffer,
something like:

// Main loop
while(true) {

  // Get input //
  ...

  // Update scene //
  ...

  // Render scene
  for(i = 0; i < samples; ++i) {

    // Clear depth buffer (my skybox "clears" the color buffer)
    ...

    // Jitter projection matrix
    jitter_projection();

    if(!i)
      glAccum(GL_LOAD, 1.0 / samples);
    else
      glAccum(GL_ACCUM, 1.0 / samples);

    // Render stuff //
    ...
  }

  glAccum(GL_RETURN, 1.0);
  SDL_GL_SwapBuffers();
}

	The problem I'm having is that my frame rate drops from an average of
100 FPS to *1.2* FPS! This happens both on Linux and on Windows 2000,
and I tested it on my own computer, with an ATI Rage Fury PRO 128, and
on a friend's machine with a GeForce4 MX400. Any clues on this one?




Thanks for your attention,
Miguel A. Osorio.