Steve Baker wrote:
> ...yes - and after all, a non-realtime "traditional" compiler can just optimise
> *ALL* of the code that well - regardless of whether it's used frequently or not.
> The reason HotSpot is a good idea for an interpreter is that for infrequently
> executed code, you can't optimise the code as well as you'd like to because
> it takes longer than the time you'd save. Hence, recognising the hot spots
> allows you to come *closer* to the performance of a conventional compiler.
> I think this is a ridiculous claim - and if it were remotely likely to be
> true then that's how C++ compilers would work. You'd compile the program
> offline - optimise it the best you possibly could with static knowledge of
> the nature of the program - then keep a copy of the code and re-optimise
> it based on this nebulous "additional information" that somehow becomes
You're focusing too much on loop unrolling, for which it is true that a 'static'
compiler can often do a better optimizing job more easily. But HotSpot can in
theory (I don't think it already does this at the moment) do much more than that.
It can do statistical analysis of the parameters passed to functions and base
optimizations on that. Take my switch example, where statistical analysis showed
that the parameter given to the switch is between 1 and 10 in 90% of the cases
(just an example). If a runtime environment can detect that, then it can
optimize the switch code to be most efficient for values between 1 and 10.
A static compiler can never do that, because it doesn't have that statistical
information.
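To make the idea concrete, here is a hypothetical sketch (not what HotSpot actually emits, and all class/method names and values are made up) of the kind of transformation a profile-guided runtime could apply: if runtime statistics show the argument falls in 1..10 for ~90% of calls, that range can be served by a bounds check plus a direct table lookup, with the original general switch kept only as a slow path for the rare values.

```java
// Hypothetical illustration of profile-guided switch dispatch.
// ProfiledSwitch, dispatch(), HOT, and the case values are invented
// for this sketch; they are not part of any real HotSpot interface.
public class ProfiledSwitch {

    // Precomputed results for the hot range 1..10, taken from the
    // corresponding cases of the original switch below.
    private static final int[] HOT = {11, 12, 13, 14, 15, 16, 17, 18, 19, 20};

    // Fast path: one range check plus an array index for the ~90% case.
    static int dispatch(int x) {
        if (x >= 1 && x <= 10) {
            return HOT[x - 1];      // common values end here
        }
        return dispatchSlow(x);     // rare values take the full switch
    }

    // Slow path: the original, fully general switch.
    static int dispatchSlow(int x) {
        switch (x) {
            case 1:  return 11;
            case 2:  return 12;
            case 3:  return 13;
            case 4:  return 14;
            case 5:  return 15;
            case 6:  return 16;
            case 7:  return 17;
            case 8:  return 18;
            case 9:  return 19;
            case 10: return 20;
            case 100: return 1000;
            default:  return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(dispatch(3));    // hot range, table lookup
        System.out.println(dispatch(100));  // cold value, general switch
        System.out.println(dispatch(42));   // default case
    }
}
```

The point is that only a runtime that has watched real argument distributions can know that the 1..10 fast path is worth inserting; a static compiler choosing between a jump table and a binary search of cases has no such information (unless it is fed separately collected profiles).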
Jorrit.Tyberghein@uz.kuleuven.ac.be, University Hospitals KU Leuven BELGIUM
"If you put butter and salt on it, it tastes like salty butter."
-- Popcorn comes to the Discworld
(Terry Pratchett, Moving Pictures)