
Re: [pygame] [Pygame] How can I have an accurate timing when I am showing an image?



On Fri, May 23, 2008 at 9:15 PM, Noah Kantrowitz <kantrn@xxxxxxx> wrote:

Anywhere you need that kind of timing accuracy, python is probably not the right tool for the job. I would recommend using plain SDL in C; you will likely find it much more stable timing-wise.

You may be right that python is a poor tool in cases where you need timing accuracy around a lot of python code execution, but I really doubt python poses any limitation for just showing an image for a specific amount of time. Task switching, timer resolution and video refresh all eat larger chunks of time than python code execution does, and for most tasks the approach you take to the timing matters a lot more than whether python or C is executing it.
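
If you want to convince yourself of that, here's a quick stdlib-only sketch of mine (nothing pygame-specific) that estimates your timer's resolution and how far time.sleep() overshoots a 1 ms request - on most desktops those numbers dwarf the cost of running a few lines of python:

import time

# spin until time.time() reports a new value, to estimate its resolution
def timer_resolution(samples=100):
    smallest = None
    for _ in range(samples):
        t0 = time.time()
        t1 = time.time()
        while t1 == t0:
            t1 = time.time()
        if smallest is None or t1 - t0 < smallest:
            smallest = t1 - t0
    return smallest

# see how far time.sleep() overshoots a short request - the OS
# scheduler, not the interpreter, dominates this number
def sleep_overshoot(requested=0.001, samples=100):
    worst = 0.0
    for _ in range(samples):
        t0 = time.time()
        time.sleep(requested)
        overshoot = (time.time() - t0) - requested
        if overshoot > worst:
            worst = overshoot
    return worst

print("timer resolution: %.6f s" % timer_resolution())
print("worst overshoot on a 1 ms sleep: %.6f s" % sleep_overshoot())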


Francesco,
   the code you sent seems wrong to me - if you want to measure how long each image is visible, shouldn't you be timing from the display.update() that presented the image to the display.update() that presented something other than the image?

in other words, shouldn't it be more like this (waiter being something like the AccurateWait class attached below)?
       draw_the_image(screen)
       waiter.start_timer()
       pygame.display.update()
       draw_the_next_thing(screen)
       waiter.wait_for_time_from_start(TIME_TEST_IMAGE)
       pygame.display.update()
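
Fleshed out into something closer to runnable code, that ordering might look like the following untested sketch (the filename, window size, and duration are made up for illustration; AccurateWait is the class attached below):

import pygame

TIME_TEST_IMAGE = 0.345  # seconds the image should stay up (made-up value)

pygame.init()
screen = pygame.display.set_mode((640, 480))
image = pygame.image.load("stimulus.png").convert()  # hypothetical file

waiter = AccurateWait()  # from the attached module

screen.blit(image, (0, 0))
waiter.start_timer()               # start the clock right at the update...
pygame.display.update()            # ...that makes the image visible

screen.fill((0, 0, 0))             # prepare the next frame while the image shows
shown_for = waiter.wait_for_time_from_start(TIME_TEST_IMAGE)
pygame.display.update()            # the image is replaced here

print("image was up for about %.4f s" % shown_for)
pygame.quit()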


...also, it seems that the approach Charlie suggested for timing (record the start time, do some work, then wait until you reach a specific time after your start time) is better than just doing a fixed delay after the work, because it adjusts the delay by however long the work took, so a slow frame doesn't throw the schedule off.
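
Here's a toy demonstration of the difference (work() is a made-up stand-in for drawing, with a deliberately variable cost):

import time
import random

def work():
    # stand-in for drawing; takes a variable amount of time
    time.sleep(random.uniform(0.0, 0.02))

PERIOD = 0.05  # want one tick every 50 ms

# fixed delay: the work time piles on top of the sleep, so the
# schedule drifts late by however long each work() call took
start = time.time()
for i in range(5):
    work()
    time.sleep(PERIOD)
print("fixed delay:    5 ticks took %.3f s" % (time.time() - start))

# deadline-based: sleep only until the next absolute target, so the
# work time is absorbed and the ticks stay on schedule
start = time.time()
deadline = start
for i in range(5):
    work()
    deadline += PERIOD
    remaining = deadline - time.time()
    if remaining > 0:
        time.sleep(remaining)
print("deadline-based: 5 ticks took %.3f s" % (time.time() - start))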

Attached is the source for a little class that will hopefully help in using that approach.


import time
import sys

class AccurateWait(object):
    def __init__(self):
        # pick the highest-resolution wall clock for the platform: on
        # windows, time.clock() wraps QueryPerformanceCounter; on mac
        # and linux, time.time() already has sub-millisecond resolution.
        # sleep_buffer is how early to stop sleeping and start busy-
        # waiting, to cover the scheduler's wakeup slop.
        if sys.platform.startswith("win"):
            self.high_res_timer = time.clock
            self.sleep_buffer = .05
        elif sys.platform == "darwin" or sys.platform.startswith("linux"):
            self.high_res_timer = time.time
            self.sleep_buffer = .02
        else:
            raise Exception("Don't know the best timer for your platform")
    
    def start_timer(self):
        self.start_time = self.high_res_timer()

    def wait_for_time_from_start(self, seconds):
        target = self.start_time + seconds
        time_now = self.high_res_timer()
        while time_now < target:
            # sleep off most of the remaining time so we don't hog the
            # cpu and get pre-empted for it; stop sleep_buffer early and
            # busy-wait the rest, since sleep() tends to overshoot
            if target - time_now > self.sleep_buffer:
                time.sleep(target - time_now - self.sleep_buffer)
            time_now = self.high_res_timer()
        return time_now - self.start_time
    
    def get_time_from_start(self):
        return self.high_res_timer() - self.start_time

if __name__ == "__main__":
    # test it out...
    waiter = AccurateWait()
    
    # wait a bit to get out the kinks...
    time.sleep(1)
    
    waiter.start_timer()
    print(waiter.wait_for_time_from_start(.345))
    print(waiter.wait_for_time_from_start(.352))
    print(waiter.wait_for_time_from_start(.5))
    
    waiter.start_timer()
    print(waiter.wait_for_time_from_start(.11))
    print(waiter.wait_for_time_from_start(.2))
    print(waiter.wait_for_time_from_start(.3))
    print(waiter.get_time_from_start())