
Re: gEDA-user: HELP with Icarus Verilog, 0.7!



From: "Samuel A. Falvo II" <sam.falvo@falvotech.com>
Subject: Re: gEDA-user: HELP with Icarus Verilog, 0.7!
Date: Tue, 6 Apr 2004 23:33:51 -0700
Message-ID: <200404062333.51302.sam.falvo@falvotech.com>

> On Tuesday 06 April 2004 05:15 pm, Mike Jarabek wrote:
> > You are correct, it's not always a bad idea.  I did not qualify my
> > statement adequately.  When you use the clock as part of your logic
> > you automatically double the frequency that your circuit must operate
> > at.  This is probably not a big deal when your clock is under 20MHz,
> > but it certainly affects your design at speeds like 100MHz.  If the
> 
> Oh, well, yeah.  :)  No arguments there.  In fact, I'm surprised that 
> modern microprocessors don't have input-only and output-only buses 
> exposed at the physical pin boundary.  I know it takes a lot of pins to 
> be able to do so, but egads, with 400+ pin BGAs, you'd think it would 
> draw some attention.  I suppose point-to-point buses like HyperTransport 
> offer similar functionality to having split, fully synchronous 
> interfaces though.

For certain applications it is the only way to go. Bi-directional busses cost a
lot of cycles to change direction unless they are point-to-point and accurately
timed. Look at QDR memories and you'll see what I mean!
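As a rough illustration (just a sketch, with made-up module and signal names),
a bi-directional port needs a tri-state driver, and at least one dead
turnaround cycle with nobody driving is usually required before the other side
can take over -- that is the direction-change cost:

// Sketch of a bi-directional data port (names are illustrative).
// The bus is only driven while drive_en is high; both sides must
// tri-state for a turnaround cycle before ownership changes.
module bidir_port (
    input  wire       clk,
    input  wire       drive_en,   // high when this side owns the bus
    input  wire [7:0] dout,       // data to drive onto the bus
    output reg  [7:0] din,        // data captured from the bus
    inout  wire [7:0] data        // the shared bi-directional bus
);
    // Drive the bus only when enabled, otherwise high-impedance.
    assign data = drive_en ? dout : 8'bz;

    // Capture incoming data on the clock edge when not driving.
    always @(posedge clk)
        if (!drive_en)
            din <= data;
endmodule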

> My ultimate goal is to produce an asynchronous, MISC architecture CPU that 
> I can use in a future design, specifically because the reduction in EMI 
> and power consumption more than outweighs its disadvantages.  But that 
> is a very, very long-term project for me, and is not a high priority.  

You don't necessarily need to run asynchronously to gain a huge reduction in
EMI. You can work small wonders in strictly synchronous designs by
strategically holding signals static while a block is unused, and some chips
even turn off the clock to parts of the chip when they are idle. For instance,
some CPUs will cut the clock to the FPU when it is not in use, which saves
power. The cost of shutting down the clock to part of a chip is that when the
block is needed again, the clock must be turned back on and run for a number of
cycles before the block is ready to use, and while that happens processing
temporarily halts.
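The clock-gating idea looks roughly like this in Verilog (just a sketch with
made-up names; real chips use a dedicated clock-gating cell from the library
rather than hand-built logic like this):

// Rough sketch of glitch-free clock gating, the kind used to stop the
// clock to an idle block such as an FPU.
module clock_gate (
    input  wire clk,
    input  wire enable,   // request to keep the block clocked
    output wire gclk      // gated clock fed to the idle-able block
);
    reg enable_latched;

    // Latch the enable while the clock is low, so it cannot change
    // in the middle of a high phase and create a runt pulse.
    always @(clk or enable)
        if (!clk)
            enable_latched <= enable;

    assign gclk = clk & enable_latched;
endmodule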

> > Using a clock as a qualifier inside of an FPGA can be dangerous.
> > Especially if you are using a look up table based FPGA, if one of the
> > signals is asynchronous to the clock, glitches can appear on an
> 
> Forgive my naivety, but why is this?  This seems like such a basic thing 
> to be able to do.  I know that many FPGAs have dedicated lines to 
> support a clock, complete with clock skew compensation buffers 
> throughout the chip.  But what if the clock were also routed to a data 
> input, for use in a border interface of some kind?

Timing and timing analysis become a hell of a lot more complex. It is much
better to let the clock run freely and use enable signals instead. The enable
signal is then just another logic signal and synthesizes cleanly to a real
FPGA.
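In Verilog that is simply (sketch, names made up):

// The flop always sees the free-running clock; the qualifier is just
// another synchronous signal, so static timing stays simple and this
// maps directly onto FPGA flip-flops with clock-enable pins.
module enabled_reg (
    input  wire       clk,
    input  wire       en,    // synchronous enable, ordinary logic signal
    input  wire [7:0] d,
    output reg  [7:0] q
);
    always @(posedge clk)
        if (en)
            q <= d;          // update only when enabled; otherwise hold
endmodule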

> > When you do this in an ASIC, you have to carefully constrain the paths
> > when you attempt timing closure.  It's often hard to communicate this
> 
> Ahh, I should say that I'll be targeting either a CPLD or FPGA for my 
> designs.  No way in hell I can support a custom ASIC (much as I'd love 
> to...).  :)

So, you don't have an infinitely large budget??? Bye! ;O)

Cheers,
Magnus