Re: gEDA-user: Free Dog meetings at MIT starting this September!
On Saturday 21 August 2004 08:42 am, Karel Kulhavý wrote:
> I don't consider it a threat, but consider it to suck. Opinions
> of others?
Closed hardware is itself as much a threat as closed software is. The
industrial revolution took off with the introduction of standardized
parts. Prior to that point, all components for machines were custom
made, every single time, and you could get them only through ONE vendor,
who was free to charge whatever they liked for them. If I remember
rightly, this was actually a rather significant problem during the
American Civil War, where components from one soldier's gun would not
fit another soldier's gun (if it wasn't the Civil War, then it was
certainly the case for the War of Independence). The same was once true
of computers.

Back in the heyday of computers, each component from a vendor actually
came with full schematics. Indeed, everywhere you look in computer
history, you find people taking their PDP-8s or IBM 7054s or whatever,
and implementing some new feature that was officially unsupported by the
original vendors. Case in point: Unix was originally developed on a
PDP-7. But this was not an unadorned PDP-7. To support more efficient
swapping of processes to and from core, they bolted on a KS-10 (which
was never intended for the PDP-7, and certainly not officially supported
by the folks at DEC). For the longest time, it wouldn't work.
Thankfully, due to the existence of full schematics, they tracked the
problem down to a missing inverter chip.
I challenge anyone to do something like this with currently available
closed hardware.

The distribution model for open hardware need not be the same as that for
open source software. Nobody, I think, is asking for that to happen.
Hardware requires tangibles to manufacture, and labor to assemble and
ship. These resources must be paid for. But that doesn't mean that the
guts of the product should be wrapped up so tightly that even God can't
see what's inside.
Remember the old 8-bit computer magazines? It seemed like hardly a
month went by without someone publishing some new and exciting
hardware-level hack for their computer -- something that made it work
better, run faster, address more memory, or display more colors. They
could do
these things because schematics were generally available for their
computers, and hardware-level information on their custom chips was
readily available as well.
Today, homebrew hacking has been reduced to a hobby so marginalized that
most computer users don't even know it's happening, and indeed, MOST
hobbyists don't even know who else is participating! In the ham
radio community, for example, most computer-interfaced projects I've
seen work via the parallel port or the serial port -- that is it.
These are ports that have a very limited lifespan in modern PC
architectures. USB is the next "entry-level" port that is available to
use, and the barrier to entry in using that is immense. I cannot think
of a single ham radio, home-brew project I've seen documented in any ham
radio publication that even once employs the USB interface. I do not
believe this to be a coincidence -- while 8051s and other (relatively)
inexpensive microcontrollers that can interface directly with the USB
hardware layer are available, the *software* must be insanely complex
(if not inside the microcontroller, then certainly on the host
computer's OS side of things!).
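To see why the parallel port was such a low barrier, consider what a
typical ham-radio keying interface actually does: it toggles one data
pin. The whole "protocol" is computing an 8-bit value for the data
register; getting it onto the wire is a single port write (0x378 was
the classic LPT1 base address on PCs). A minimal sketch -- the pin
assignments here are purely illustrative, and the actual port write is
omitted:

```python
def set_pin(byte: int, pin: int, high: bool) -> int:
    """Return a new data-register value with data pin `pin` (0-7) set or cleared."""
    if high:
        return byte | (1 << pin)
    return byte & ~(1 << pin) & 0xFF

# Hypothetical keying sequence: the real hardware step would be writing
# `state` to the port (e.g. via outb() to 0x378), one call per change.
state = 0
state = set_pin(state, 0, True)   # key the transmitter on D0
state = set_pin(state, 3, True)   # light an indicator on D3
state = set_pin(state, 0, False)  # unkey
```

Compare that with a USB device, where even "hello, world" means
descriptors, enumeration, endpoint management, and a host-side driver
stack.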
I believe that open hardware can substantially mitigate every one of
the aforementioned problems. While it won't solve everything, it will at
least let people make more informed choices, and choose technology based
on a solid analysis of needs-vs-wants, instead of whatever is the latest
fad in interfacing. For example: why is there a need for USB at all?
Why not just take IEEE-488, serialize it (something that has most
definitely been done before, by both HP and Commodore, to name just
two -- and HP's solution was fully auto-configuring too!), and make the
PHY layer faster to accommodate more devices? Why go through the whole
process of
designing a WHOLE new wire-level protocol? That's a LOT of money
wasted, JUST to somehow "be different." Such a difference allows them
to more easily control who has access to the SIG, and how much they PAY
to be a member, to get unique device IDs and such. Give me a break.
The computer industry lasted 30+ years before such things were needed.
Criminey, Amiga's Zorro bus was also fully auto-config (in fact, that's
what distinguished it from its competitors at the time; even Macintoshes
didn't have expansion buses then!), and Commodore just gave away company
IDs (and device IDs were freely chosen by the company within the context
of a company ID). The only thing they did was maintain a registry. No
need for SIGs here.
Anyway, I'm going to get off my soap-box now. These are my opinions, as
always.
Samuel A. Falvo II