
Re: gEDA-user: Reviews




On Dec 17, 2005, at 1:53 PM, David Hart wrote:

> If you work for a large company, what review processes do your designs
> go through? I'm interested in peer-review and formal-review processes
> for schematic designs and board layouts.

Well, I can tell you a little about NASA and ISAS (Japan).


> For these design reviews, what defines acceptance and/or approval?

The usual criterion is that all issues raised by the reviewers are closed, either by design changes or by demonstration that change is not required. Usually the reviewers and designers agree, issue closed, but sometimes they don't. Then, there's generally a manager at whose desk the buck stops (titles may vary).


In the NASA system, that manager is generally an institutional politician without deep technical expertise. Usually he'll consult experts and make a decision based on their recommendations. The trouble is that within NASA you can almost always find an "expert" who will assert (with great confidence) any opinion you want on any issue. So the manager, being a politician, chooses the expert who best supports whatever political agenda he has. Thus, the same sort of politicized process that led to conspicuously bad operational decision making in the two shuttle disasters also operates in the design review process.

In contrast, ISAS demands that a mission manager be both a world-class scientist and a world-class systems engineer. Unsurprisingly, that vastly improves the quality of technical decision making by management. Of course such people don't appear out of nowhere: ISAS has to encourage people to take this career path. This improves technical decision making throughout the organization, since it creates a culture in which people at all levels continue to study and upgrade their skills. Nice side effect.

Unfortunately, a NASA manager who actually studies the stuff he's managing will find himself in deep trouble. Other managers will gossip about what a fool he is (I've seen it happen), and he'll find they are right: his authority will be undermined, his career stalled.


> When do customers get involved in the review process? If ever? This question is more for the contract/consulting engineer, but may apply to the corporate types as well.

Well, it depends hugely on the customer. Good design review is more difficult than design: once you're behind the interfaces you're into somebody else's style and methodology. You have to throw away your ego and prejudices to really do it right. Does the customer have the capability to do proper review? Egoless engineers are a treasure, and are therefore probably overcommitted. The customers from hell are the ones that *think* they can do review but really can't (e.g. NASA centers).


On the other hand, right now I'm working on a mixed-signal CMOS chip design for a university lab, and they want monthly reports including schematics and netlists. They have two students who study everything, run additional simulations, and ask a lot of questions. Part of their charter is to learn, so they *must* get into my approach. From my perspective it's a very good incremental review process: two bright young guys combing through the details without prejudice.


> I used to work in the DoD engineering environment years ago, and was
> talking to friends still in the business. I was shocked to learn that
> the old standards for design review have been significantly relaxed, and
> in many cases are now as informal as "good engineering judgment".

I don't know about DoD, but in NASA the review process is completely broken. I've *never* seen a NASA PDR or CDR committee correctly identify a problem: they distract you with bunches of non-problems and miss the real ones. When I was at MIT, internal reviews were a bit better, but hardly comprehensive. Yes, it's scary to be a designer without a working safety net.


ISAS is a bit better: reviews are informal but more comprehensive than NASA's. ISAS is a big believer in early prototyping (while penny-wise, pound-foolish NASA tries to save money by skimping on prototypes). When you deliver the protomodel to Sagamihara for testing, you deliver documentation as well, and experts study it (you can tell they are experts by the quality of the questions that come back). It's incremental: there's no scheduled event like PDR or CDR. When the questions and answers stop flowing, you're done.

Unfortunately, since ISAS merged with NASDA to form JAXA, the institutional politicians have been incrementally gaining authority. Not good for the future.

> I'll never get to the moon with this attitude!!!

The next human you see on the moon will be Chinese.


> And, for the hobbyist and/or student, what is a good review process?

Well, beggars can't be choosers. There's really no point in worrying about a formal process: the challenge is finding someone who is capable and willing. Offering a peer "I'll review your project if you'll review mine" is likely to be very educational, and almost certainly more effective than anything an aerospace review committee could do.


Getting back to the professional situation, I'll also comment that you shouldn't overestimate what a formal review process can accomplish. Even the best is very leaky. And despite NASA's worse-than-useless formal review process, most designers of space hardware for NASA missions manage to succeed. There are plenty of other effective ways to catch errors.

One that's extremely effective (and not obvious) is to pair up application experts with engineers. If possible, let the application expert do the design, and save the engineer for review, test, and polishing. This works because the application expert probably understands the requirements better, especially the ones that seem so obvious (from the user's point of view) that nobody thinks to write them down. The difficulty is that the engineer winds up doing the more difficult but less glamorous part of the job: I repeat that an egoless engineer is a treasure.

Simulation is very helpful, and that's an area where capabilities have improved dramatically.

Testing is extremely important. Don't underestimate the value of *informal* testing: noodling around with the system, just trying crazy things. Very scary with a $10 million piece of hardware, but remember that a tired spacecraft operator is likely to do almost anything when something unexpected happens at 3 AM. Of course this is a good reason to have an extra prototype to play with.

All these precautions are leaky: the hope is that you can find a watertight combination. Unfortunately, there are no guarantees.

John Doty              Noqsi Aerospace, Ltd.
jpd@xxxxxxxxxxxxx