Steve Baker wrote:
> Lastly, there is a 'feedback' effect here.
> The whole 'RISC' concept is founded on that premise (although 'RISC'
> has never made it into PC's - it's what 'real' computers use).
These days it's even worse than that. Instructions are chopped up
into micro-ops, which get executed on whatever execution unit is
available. This goes much further than simple n-stage pipelines, or
multiple pipelines (the U/V thing). Here the decoder keeps generating
micro-ops until the buffer is full, and the execution units pull
micro-ops out of that buffer in *any* order, running each one as
soon as its input data is available (from memory or from previously
executed out-of-order micro-ops). The results are held in a
temporary buffer until all the micro-ops that originally preceded
them have completed as well, to make sure the final results are
written back to memory in the right order, "right" meaning
"what you'd get if we *did* execute the asm code linearly".
Compared to this, the old skool trick of overlapping integer math
with an fdiv is kindergarten level :)
If it sounds complicated, that's because it is :) You can't even
know in what order your assembly instructions will be executed...
It sure convinced me, a former asm freak from X-Mode up to MMX, to
stay away from asm as long as possible. These CPUs are so deep
that even the compiler has trouble finding the best code for them;
no way we'll beat that... I assume that specialists, in specialized
areas like device drivers, could still make good use of asm..
but for general purpose game/engine writing.. I dunno..
-=<Short Controlled Bursts>=-