Re: New package management
Bryan Patrick Coleman wrote:
> On 25 Sep 1999 12:34:01 +0200, Ingo Ruhnke wrote:
> > Pierre Phaneuf <firstname.lastname@example.org> writes:
> > > He did have a full setup of GNOME, KDE and plenty of candy installed,
> > > just no development tools, so I guess it isn't safe to assume "gcc" or
> > > even "make"!
> > You could be right.
> > Back to topic, we don't need a new package system for binaries; the
> > ones out there are good enough. So for a 'new' package/build system we
> > could safely ignore all those people out there without any development
> > tools and instead focus on the people who have a fully
> > featured Linux installation (with gcc, etc.), but who build only a
> > very small number of packages themselves (for example, just the kernel).
> What I have in mind is a kind of super package class that can do more (more
> easily) than what is out there. I think I am going to use dpkg & apt as the
> underlying packaging system. What I am envisioning is "grouping" several deb
> packages into one superfile that can install or build packages as needed.
> For example, let's look at the Pingus situation. You have this super package
> that contains the ClanLib and Hermes libs for Pingus.
No - that's silly. Now someone who already has Clanlib and Hermes will
have to download all that code all over again.
In my case, I'm writing a new game. It's pretty tiny - just 3,000 lines
of C++ code. But the end user needs:
PlumbCrazy (the game)
which needs PLIB
which needs GLUT
which needs Mesa
which might need GLIDE
which needs Zlib
That turns the teeny tiny game (just a few K of download) into a package
of about 10Mb.
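As a sketch, the chain above can be walked mechanically. The `deps_of` table
below is hand-written for illustration (a real tool would query a package
database instead), but it shows how little per-package information a
dependency-chasing installer actually needs:

```shell
#!/bin/sh
# Illustrative dependency table - mirrors the PlumbCrazy chain above.
deps_of () {
    case $1 in
        PlumbCrazy) echo PLIB ;;
        PLIB)       echo GLUT ;;
        GLUT)       echo Mesa ;;
        Mesa)       echo GLIDE ;;   # only needed on some hardware, maybe
        GLIDE)      echo Zlib ;;
        *)          ;;              # leaf package: no dependencies
    esac
}

# closure: print a package followed by everything it (transitively) pulls in.
closure () {
    echo "$1"
    for d in $(deps_of "$1"); do closure "$d"; done
}

closure PlumbCrazy   # prints PlumbCrazy, PLIB, GLUT, Mesa, GLIDE, Zlib
```

Everything past the first line of that output is the 10Mb of baggage that
a few-K game drags along if it's all bundled into one file.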
1) A *few* people will have none of those libraries installed.
2) Most people will need one or two of them.
3) A few people will already have all of them installed.
There is no way that bundling everything into one huge install
is going to address those users.
For people in group (1) it's unlikely that we'd have bundled GLIDE or
Mesa into the tarball. For people in group (2), we *might* guess right
and pick the set of libraries that they need. For people in group (3),
it's all a huge waste of bandwidth.
Not only that, but there is no doubt that we'd include versions of some
libraries that were hopelessly out of date - nothing out there in Linux
land stands still.
A much better scenario is the one I proposed with the (poorly named)
autoweb scheme. People in group (3) pull down the game - it checks,
finds all the stuff it needs and then installs itself. People in
group (2) pull down the game - it pulls in the odd library they are
missing (or which is out of date) and installs them - then the game
itself. People in group (1) have a long wait on their hands - but
c'est la vie.
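A minimal sketch of that check-then-fetch step, assuming a helper that
compares the needed list against what's installed. The library names and
the stubbed listing below are illustrative; a real version would parse
`ldconfig -p` for the installed set and then hand whatever is missing to
wget/dpkg:

```shell
#!/bin/sh
# find_missing NEEDED CACHE - print the libs from NEEDED that don't
# appear in the installed-library listing CACHE.
find_missing () {
    needed=$1; cache=$2; missing=""
    for lib in $needed; do
        grep -q "lib$lib" "$cache" || missing="$missing $lib"
    done
    echo "$missing" | sed 's/^ //'
}

# Demo with a stubbed listing (stands in for 'ldconfig -p' output):
cache=$(mktemp)
printf 'libglut.so.3\nlibMesaGL.so.2\n' > "$cache"
find_missing "plib glut Mesa zlib" "$cache"   # prints: plib zlib
rm -f "$cache"
```

Group (3) gets an empty list and a tiny download; group (2) fetches a lib
or two; group (1) fetches the lot - exactly the behaviour described above.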
> When you install the package, it sees the need to install the libs as well
> and does so.
A Good Thing - but why pull down all those binaries if we don't have to?
> I also am going to include source in the packages so that binaries can be
> built from the same package - even using the option to build statically linked
> binaries. Basically the libs would be installed for the build and then
> removed after the binary is built.
...causing people to have to download the damned things again whenever
they grab another game that uses them.
In the late 1990's, disk space is extremely cheap - but for many people
outside the US, bandwidth is incredibly expensive. For many people in
the US who don't have cable modems or whatever, bandwidth is incredibly
expensive too.
> I would like to say something in the defense of static linking. There is
> currently a glut in the lib market with new projects cropping up all the
> time. If each new game that comes out is using a different lib, you really are
> not gaining anything. Plus, what if you really want two games that use
> incompatible versions of a lib? Say ClanLib. Then you
> could have the new ClanLib installed, but have the Pingus game do a static
> link with the old lib.
Dynamic linking has that nailed. That's why you'll see ".so" files with
version numbers after them.
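For instance (the filenames here are illustrative - ClanLib's real soname
may differ), two incompatible major versions can sit side by side, because
each binary records the soname it was linked against and the runtime linker
resolves each soname independently:

```shell
#!/bin/sh
# Sketch of versioned shared libraries coexisting in one lib directory.
libdir=$(mktemp -d)

# Two incompatible major versions, installed side by side:
touch "$libdir/libclanlib.so.1.0" "$libdir/libclanlib.so.2.3"

# ldconfig would normally create these soname symlinks; an old game
# linked against libclanlib.so.1 and a new one linked against
# libclanlib.so.2 each get the version they were built for:
ln -s libclanlib.so.1.0 "$libdir/libclanlib.so.1"
ln -s libclanlib.so.2.3 "$libdir/libclanlib.so.2"

ls "$libdir"        # lists all four names - no conflict
rm -rf "$libdir"
```

So the "two games, two incompatible lib versions" problem is already solved
without static linking - at the cost of keeping both versions installed.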
I agree with Pierre - we don't need yet another binary package manager.
RPM's and their ilk have that covered.
We don't need a new source manager either - autoconf/automake have that
covered.
Anything you did in that area would simply create Yet Another Standard -
which would be A Bad Thing. Doing what you propose would make matters
worse - not better - because the poor luser gets yet another set of
commands to master.
Look - we are *just* getting to the point where RPM and autoconf/automake
are reasonably "standard" - most binaries come in RPM form and most
sources build with configure scripts.
What's needed is the higher level thing that ferrets out all the
dependencies of a big package and assembles it from the various RPM's and
tarballs.
Steve Baker http://web2.airmail.net/sjbaker1
email@example.com (home) http://www.woodsoup.org/~sbaker