Re: Of time scaling and FPS
Quoting "Miguel A. Osorio" <firstname.lastname@example.org>:
> Anyway, on to the FPS problem. Using the method described above, I use
> the delta to scale my time-dependent variables, in order to get smooth
> animation. Thing is, I don't get it. The whole animation looks all
> jagged up, whereas without time scaling it runs fine. Do note, however,
> that the FPS reading keeps jumping about *all the time* (I don't know
> why); it never settles on an average. Does anyone know a reason for
> this? Or am I doing the whole time scaling calculation the wrong way?
Yes. time(2) returns the current time in whole SECONDS. You'd be wanting to
retrieve it by calling gettimeofday(2) instead, which fills in a structure
containing seconds and microseconds since 00:00, Jan 1 1970.
You can then do the subtraction, with the borrow between the seconds field
and the microseconds field, and then you have a microsecond difference since
the last frame to play with as you will. It's not 100% accurate (what is?),
and I can't recall exactly what precision you can expect, but I've certainly
had no issues treating it as millisecond-accurate.
 You could just convert the numbers to floats and subtract them, BUT as they
stand the numbers are too big to convert to float without losing precision.
 Because OpenGL runs off floats, so does the rest of my code to prevent
Katie Lauren Lucas, Consultant Software Engineer, Parasol Solutions
email@example.com firstname.lastname@example.org http://www.parasolsolutions.com