
Re: [pygame] Re: 100% CPU



On 4/23/07, Dave LeCompte (really) <dave@xxxxxxxxxxxx> wrote:
> Why you would WANT to max out the CPU really baffles me.

> Grampa used to chastise me for only using 50% of the length of the
> handsaw. He said "I paid for every tooth on the saw blade, use them all."
> Wasting (including underusing) the CPU seems like the same thing.

That is an adorable analogy. I can really picture the young Dave
LeCompte, looking up over his grandfather's wagging finger as he
learns this important life lesson.

However, it is also nowhere near an applicable analogy in this case. So
if you only ever use 20% of the CPU, I guess you'll burn out that 20%
and then have to throw away an 80%-good CPU? What a waste. All those
80%-good Intel CPUs lounging in a trash heap!


> Well, sure. If the video rendering takes 100% of the CPU, any game,
> regardless of whether it tries to take all CPU cycles available or not,
> will suffer.

I get where you are going, but a game that only uses 10% would suffer
much less than a game that uses 100%. Likewise, the video rendering
will suffer less with the 10% neighbor than with the 100% neighbor.
It's easier on the OS scheduler when less is asked of it.

You may think I'm being ridiculous - but that is absolutely a real-world
case for any really simple game that does busy waiting to control its
frame rate, and that was one of the examples Nathan brought up as a
wasteful game.
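The busy-waiting case is easy to demonstrate without pygame at all. Here's a minimal sketch (the function names are mine, not from any library) contrasting a spin loop, which pegs a core at 100%, with a sleep-based frame limiter that hands the spare time back to the OS:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS


def busy_wait_frame(frame_start, frame_time=FRAME_TIME):
    """Spin until the frame slot elapses -- burns 100% of a core doing nothing."""
    while time.perf_counter() - frame_start < frame_time:
        pass  # pure busy wait


def sleeping_frame(frame_start, frame_time=FRAME_TIME):
    """Sleep away the spare time -- the OS can schedule other work meanwhile."""
    remaining = frame_time - (time.perf_counter() - frame_start)
    if remaining > 0:
        time.sleep(remaining)


# One frame of a hypothetical game loop:
start = time.perf_counter()
# ... update and draw would happen here ...
sleeping_frame(start)  # cap the frame rate without hogging the CPU
```

For what it's worth, pygame itself offers both styles: `pygame.time.Clock.tick()` delays cooperatively, while `Clock.tick_busy_loop()` busy-waits for a more accurate frame rate at the cost of the CPU usage being complained about here.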


> are 1) Concurrently running programs are affected minimally,

> I really don't see this as a compelling use case. Seems like most people
> play games in an essentially single-tasking mode. If they're also trying
> to run complex calculations in the background, like you are with your
> video rendering, that's just sabotaging the other program.

If you don't see it as a compelling use case, then you don't know
anything about the way users use their computers. Spreadsheets, email
programs, instant messengers, an open web browser with some Flash ads
continually running - it's more common than not to have these going on
a given Windows box running a game.

And even if the user didn't load and run lots of crap on their box,
you can expect a Windows box with nothing on it to use 1% of the CPU
all the time, and if the game doesn't decide when it's okay for that
to run, the OS will.


> But I wouldn't jump to the conclusion that just because the whole second's
> worth of CPU cycles is used by the end of the second, that anything's been
> wasted. Heck, if my bank threw away all the money in my bank account at
> the end of the month, I'd be really careful to spend as close as possible
> to 100% of my paycheck.

Again, another very compelling yet wonderfully inappropriate analogy.

Even desktop machines now scale energy use based on CPU utilization.
If you use the cycles, you pay more money for them. Granted, it's not
likely significant in any way - but I figured it was worth mentioning
because it shows the economics of CPU use are exactly the opposite of
what your analogy suggests.

...I'm not sure where you got the idea that if you somehow don't use
an available cycle from your CPU it gets lost to the land of wind and
ghosts, but I think it's awesome.