
Re: [pygame] Path finding demo, now with more fugu



Brian Fisher wrote:

> Also if any game (pygame or not) can't run at its desired frame rate
> for performance reasons, or doesn't do anything to limit execution to
> a specific frame rate, it will most likely run at 100% cpu all the
> time (it's very common for games to take up all the cpu)

I got a few comments on my recent PyWeek entry mentioning that it ran at
100% CPU. I agree with the sentiment already expressed here - this is
common, and pretty much expected. Indeed, I provide a fullscreen mode, so
that it can take up 100% of the monitor, and I don't see much conceptual
difference between taking full advantage of the pixels available to me and
taking full advantage of the CPU cycles available.

To flip around what Brian said - my desired frame rate is as fast as
possible, so using 100% of the CPU is correct behavior.

One thing to consider - different parts of a game can run at different
frame rates, and sometimes it makes good sense to do so. We just had
people talking about the benefits of input processing happening multiple
times per graphical frame, and (quirks of DirectX aside) sometimes this is
a very good thing - back in the heyday of first person shooters, devotees
insisted that reading the mouse 100-200 times per second was mandatory,
even if the graphics were "only" running at 75 frames per second.

You might also consider running your AI, physics, and any other "game
logic" at some rate separate from (and thus presumably tuned to a
different value than) your graphics and input frame rate.
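The usual way to decouple those rates is a fixed-timestep accumulator:
rendering happens once per frame, however long the frame took, while
logic/input always advances in exact dt-sized steps. A minimal sketch in
plain Python (the 100 Hz rate and the `logic_steps` helper are my own
illustration, not anything pygame provides):

```python
LOGIC_DT = 1.0 / 100.0  # hypothetical: step input/AI/physics at 100 Hz


def logic_steps(leftover, frame_time, dt=LOGIC_DT):
    """Add this frame's elapsed wall-clock time to the leftover from
    the previous frame, then return (number of fixed-size logic steps
    to run this frame, new leftover to carry forward)."""
    acc = leftover + frame_time
    steps = int(acc // dt)
    return steps, acc - steps * dt
```

Each graphics frame you would call this once, run that many input/AI/physics
updates, then draw; a slow frame simply triggers more logic steps the next
time around, so the simulation rate stays constant even when the render
rate wobbles.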


With all of that in mind, I think it makes complete sense to leave the
graphics frame rate aside when talking about CPU use.


Consider this scenario: a user is playing a strategy game (chess,
perhaps), and the computer can't provide a satisfying challenge within the
player's patience - the player doesn't want to wait two minutes per move,
and at that budget the AI is too easy to beat. Now the user gets a brand
new machine that's ten times faster - he might well expect that if his old
machine played at a certain skill level with two-minute moves, the new
machine would play at roughly the same skill level with twelve-second
moves (1/10 the time, given a machine that's 10x as fast). If the game
were implemented with a hard-coded amount of logic per second, the new
machine would perform exactly the same as the old one, which might
disappoint this player. And if the player had another game that he also
liked playing on his old machine, one that actually does perform 10x
better on the new machine, he might well decide to switch over to playing
that other game instead.
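This is why game AIs are usually given a time (or effort) budget rather
than a fixed search depth: a faster machine spends the same wall-clock
time and simply searches deeper. A toy sketch of the idea, using a node
budget as a stand-in for wall-clock time and a uniform branching factor
(both numbers are made up for illustration; real engines use iterative
deepening against a clock deadline):

```python
def deepest_search(node_budget, branching=8):
    """Toy budgeted iterative deepening: keep going one ply deeper
    until the next full ply would blow the node budget, and return
    the depth reached. A 10x faster machine gets a 10x budget in the
    same wall-clock time, so it searches deeper instead of idling."""
    depth, nodes_used = 0, 0
    while True:
        next_ply_cost = branching ** (depth + 1)  # nodes in next ply
        if nodes_used + next_ply_cost > node_budget:
            return depth
        nodes_used += next_ply_cost
        depth += 1
```

With a branching factor of 8, a budget of 100 nodes reaches depth 2, and
a 10x budget of 1000 reaches depth 3 - the extra horsepower buys skill,
not wasted idle time.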


-Dave LeCompte