
Re: Of time scaling and FPS



Quoting "Miguel A. Osorio" <maos@gbl.com.br>:
> 	Anyway, on to the FPS problem. Using the method described above, I use
> delta to scale my time-dependent variables, in order to get some smooth
> animation. Thing is, I don't: the whole animation looks all jagged up,
> while without time scaling it runs fine. Do note, however, that the FPS
> mark keeps jumping about *all the time* (I don't know why); it never
> settles on some average. Does anyone know a reason for this? Or am I
> doing the whole time scaling calculation the wrong way?

Yes. Gettime(2) returns the current time in SECONDS. You'd want to be calling 
gettimeofday(2) instead, which fills in a struct timeval containing seconds 
and microseconds since 00:00 UTC, 1 Jan 1970.

You can then do the subtraction, doing the carry between the seconds and 
microseconds fields[1], and you have a microsecond difference since the last 
frame to play with as you will. It's not 100% accurate (what is?), and I 
can't recall exactly what precision you can expect, but I've certainly had no 
issues treating it as millisecond-accurate.
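
In code, the whole thing works out to something like the sketch below. This is 
just my own illustration of the subtraction-with-carry idea, not code from the 
original post; the names (usec_between, delta_usec and so on) are made up for 
the example.

#include <stdio.h>
#include <sys/time.h>

/* Microseconds elapsed between two gettimeofday() samples, doing the
 * carry/borrow between the seconds and microseconds fields by hand so
 * the microsecond part never goes negative. */
static long usec_between(const struct timeval *earlier,
                         const struct timeval *later)
{
    long sec  = later->tv_sec  - earlier->tv_sec;
    long usec = later->tv_usec - earlier->tv_usec;

    if (usec < 0) {        /* borrow one second's worth of microseconds */
        usec += 1000000L;
        sec  -= 1;
    }
    return sec * 1000000L + usec;
}

int main(void)
{
    struct timeval last, now;
    gettimeofday(&last, NULL);

    for (int frame = 0; frame < 5; frame++) {  /* stand-in for the render loop */
        gettimeofday(&now, NULL);
        long  delta_usec = usec_between(&last, &now);
        float delta_sec  = (float)delta_usec / 1000000.0f;
        last = now;

        /* scale the time-dependent variables by delta_sec here,
         * e.g. position += velocity * delta_sec; */
        printf("frame %d: delta = %ld us (%f s)\n",
               frame, delta_usec, delta_sec);
    }
    return 0;
}

The point is that the subtraction happens on the integer fields first; only the 
small per-frame difference gets converted to float at the end.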


[1] You could just convert the numbers to float[2] and subtract them, BUT the 
numbers as they stand are too big to convert to float without losing 
precision (a short demonstration follows below the footnotes).

[2] Because OpenGL runs off floats, the rest of my code does too, to avoid 
float<->double conversions.
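
To put a rough number on [1]: a float carries about 24 bits of mantissa, and 
with tv_sec sitting around 10^9 the nearest representable float values are on 
the order of 64 seconds apart, so the sub-second part vanishes completely. A 
tiny check (again just an illustration, not anything from the original post):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);

    /* Absolute seconds-plus-microseconds since the epoch, forced into a float. */
    float as_float = (float)tv.tv_sec + (float)tv.tv_usec / 1000000.0f;

    /* With tv_sec around 10^9, a float's spacing is roughly 64 seconds or
     * more, so the printed value will have lost the fraction (and usually
     * several whole seconds too); subtracting two of these tells you
     * nothing useful about a per-frame interval. */
    printf("tv = %ld.%06ld, as float = %.6f\n",
           (long)tv.tv_sec, (long)tv.tv_usec, as_float);
    return 0;
}

A double (53-bit mantissa) could hold the absolute value with microseconds to 
spare, but as [2] says, sticking to float and subtracting the integer fields 
first avoids the conversion churn.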
_______________________________________________________________________________
      Katie Lauren Lucas, Consultant Software Engineer, Parasol Solutions
katie@fysh.org katie.lucas@parasolsolutions.com http://www.parasolsolutions.com