On 06/22/2009 10:14 AM, Dan Williams wrote:
> It's also a question of maintainability. Sure, we could split up tons
> of packages and add code to all the tools to check runtime-availability
> of every tool they might use. But that's just insane, and increases the
> maintenance burden tremendously.
This is roughly what Gentoo does, right? Of course, Gentoo has the
'luxury' of re-compiling. But that just gets at, I think, that vanilla
C isn't flexible enough to handle this dynamically. A Python app could
do it pretty easily, IIRC. In that case, a Python implementation of a
thing could conceivably compete for mindshare against the C version,
given the inherent trade-offs.
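A minimal sketch of what I mean (the helper name "bluetoothd" here is
just a made-up example): a Python app can probe PATH at startup and only
light up a feature if its external tool is actually installed, rather
than hard-Requiring the package that ships it.

    import os

    def tool_available(name):
        """Return True if an executable called `name` is on the PATH."""
        for d in os.environ.get("PATH", "").split(os.pathsep):
            candidate = os.path.join(d, name)
            if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
                return True
        return False

    # Degrade gracefully instead of dragging in an optional dependency.
    if tool_available("bluetoothd"):
        print("bluetooth support enabled")
    else:
        print("bluetooth support unavailable; feature disabled")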
One could imagine Feature: and Feature-Requires: tags in a spec that
could be used to generate more complex dependency trees and
automatically generate the proper set of package-foo.rpm files.
Integrating this with yum and/or graphical package managers would
certainly be a ton of work.
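Purely as a strawman (none of this is real rpmbuild syntax, and the
package and feature names are invented), a spec might declare:

    Name:             foo
    Requires:         glibc

    # Hypothetical tags, not actual RPM syntax:
    Feature:          bluetooth
    Feature-Requires: bluez
    Feature:          smartcard
    Feature-Requires: pcsc-lite

and the build tooling could then emit foo-bluetooth.rpm and
foo-smartcard.rpm automatically, each pulling in only its own
Feature-Requires, while foo.rpm keeps just the hard dependencies.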
But to get to the thematic question, probably nobody (for large values
of nobody) cares if any given package has a 40KB dependency. It's when
you have a thousand packages with a thousand unneeded dependencies that
you increase the cost (time, disk, memory, CPU, bandwidth, electricity,
complexity) to install, update, etc., and you wind up excluding very
small computing devices in some cases.
I agree that making humans manage this would approach insanity. But
does that necessarily preclude allowing computers to handle it?
-Bill
--
Bill McGonigle, Owner Work: 603.448.4440
BFC Computing, LLC Home: 603.448.1668
http://www.bfccomputing.com/ Cell: 603.252.2606
Twitter, etc.: bill_mcgonigle Page: 603.442.1833
Email, IM, VOIP: bill(a)bfccomputing.com
Blog: http://blog.bfccomputing.com/
VCard: http://bfccomputing.com/vcard/bill.vcf