
Re: [pygame] Time-based interpolation :: jitter problem



It's a big topic. Ideally you want to be drawing at the refresh rate
of the screen; failing that, you want to aim for a consistent frame
rate rather than drawing as fast as you can. This is why it can be
better to get 20fps consistently than 70fps most of the time and
50fps at other times. An inconsistent frame rate is itself a cause
of jerkiness.
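A minimal sketch of that idea, in plain Python (the function name and frame counts are made up for illustration; in real pygame code you would typically let pygame.time.Clock.tick() do this for you):

```python
import time

def run_frames(target_fps=20, n_frames=5):
    """Run n_frames at a consistent target_fps by sleeping off the
    leftover time each frame, instead of drawing as fast as possible."""
    frame_time = 1.0 / target_fps
    timestamps = []
    next_tick = time.monotonic()
    for _ in range(n_frames):
        # ... draw the frame here ...
        timestamps.append(time.monotonic())
        # Schedule the next frame relative to the previous deadline,
        # not relative to "now", so small delays don't accumulate.
        next_tick += frame_time
        remaining = next_tick - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
    return timestamps
```

Scheduling against the previous deadline (rather than sleeping a fixed amount after each draw) is what keeps the frame intervals consistent even when individual frames take varying amounts of time.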

pygame.time.delay will give you more accurate timing on some boxes -
at the expense of CPU and power. The idea is that you work out how
much time remains until your next tick should start, and then sleep
for that amount.
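That "work out the remaining time, then sleep" step could look something like this (a sketch with a hypothetical helper name; pygame.time.delay takes milliseconds and busy-waits for accuracy, while plain time.sleep is cheaper but less precise):

```python
import time

def delay_until_next_tick(next_tick, sleep=time.sleep):
    """Sleep until next_tick (a time.monotonic() value, in seconds).

    Returns the number of milliseconds waited. Pass
    sleep=lambda s: pygame.time.delay(int(s * 1000)) to use pygame's
    more accurate (but CPU-hungry) delay instead.
    """
    remaining_ms = max(0, int((next_tick - time.monotonic()) * 1000))
    if remaining_ms:
        sleep(remaining_ms / 1000.0)
    return remaining_ms
```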

However, it's probably the inconsistent frame rate that is causing the jerkiness.

cu,


On Fri, Jun 19, 2009 at 2:35 PM, Alexandre Quessy <alexandre@xxxxxxxxxx> wrote:
> Hi !
> I am trying to use an interpolation based on time.time() to create
> smooth animations with Pygame and OpenGL.
> The idea is to get time-based motion tween, instead of frame-based.
> It seems like there is still jitter in my interpolation. The motion is
> not uniform.
> Can anyone propose a more accurate way to measure time
> (human-perceived time, not CPU time) than time.time()?
> ...or maybe it is my Tween class code that is wrong. It is based on
> the excellent Robert Penner easing equations. Maybe there is something
> wrong with my Tween.tick(t) method.
> Can you guys take a look ?
>
> You can download my tweening test here :
> http://alexandre.quessy.net/static/pygame/tween_progress.tar.gz
> (this code is part of toonloop.com)
> --
> Alexandre Quessy
> http://alexandre.quessy.net/
>
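For reference, the kind of time-based tween the quoted message describes can be sketched as follows. This is a minimal illustration, not Alexandre's actual Tween class: the easing function is Robert Penner's easeOutQuad, and the injectable `now` parameter is an assumption added here so the clock can be swapped (e.g. for time.monotonic, which unlike time.time() is not affected by wall-clock adjustments):

```python
import time

def ease_out_quad(t, b, c, d):
    """Penner's easeOutQuad: t = elapsed time, b = start value,
    c = total change, d = duration. Decelerates toward the end."""
    t = min(t / d, 1.0)
    return b + c * (-t * (t - 2))

class Tween:
    """Minimal time-based tween: progress depends only on elapsed
    wall-clock time, never on how many frames have been drawn."""

    def __init__(self, start, end, duration, easing=ease_out_quad,
                 now=time.time):
        self.start = start
        self.change = end - start
        self.duration = duration
        self.easing = easing
        self.now = now
        self.t0 = now()

    def tick(self, t=None):
        """Return the interpolated value at time t (defaults to now)."""
        elapsed = (t if t is not None else self.now()) - self.t0
        return self.easing(min(elapsed, self.duration),
                           self.start, self.change, self.duration)
```

Because tick() is a pure function of time, a dropped frame only skips ahead in the motion rather than slowing it down; any remaining jitter then comes from when the frames are displayed, which is the frame-rate-consistency point above.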