
Here's what I really like about Gentoo:
* It's developer friendly by design. All packages come with headers and full documentation. You can even build binary packages for other distributions with it if you know roughly what you're doing.
* It does not make a fuss about the licenses of the stuff it has official ebuilds for. It's easy to install VMware, Sun Java, the evil binary NVIDIA drivers, the proprietary codecs for mplayer, even Doom 3 (without the data files, of course).
* The popular ebuilds are always up to date. I know of no other distribution where you get new stuff, fully integrated into the system, so quickly and easily.
* The USE flags let me run the same basic distribution on my server, the digital video recorder and the two "workstations". They share scripts easily, and I've set things up so that package sources are downloaded only once even if they're installed on multiple PCs; all the goodness of a homogeneous network.
* I get to pick on a per-package basis whether I want a stable or a testing version.
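As an illustration of that multi-machine setup, here's roughly what it looks like in /etc/portage/make.conf. The USE flags and the shared path are invented examples, not my actual configuration:

```shell
# /etc/portage/make.conf -- hypothetical excerpt for one of the machines.
# USE flags pick features globally; per-package overrides go in
# /etc/portage/package.use.
USE="X alsa -gnome -kde"

# All machines point DISTDIR at the same network share, so a source
# tarball fetched by one box is reused by all the others.
DISTDIR="/mnt/shared/distfiles"
```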
Things I don't care too much about: the long installation time of some packages, and the tiny speedups you may get by fine-tuning compilation flags; those are mostly a placebo effect anyway.
What drives me mad, on the other hand:
* The upgrade policy. It boils down to: install the new version, remove the old version, then tell the user what he has to do manually to fix whatever the removal of the old version broke. Not only do you have to manually run this "revdep-rebuild" tool that detects broken references to old dynamic libraries, you also have to run some libtool repair tool that I always forget about until it bites me again and I have to google the error message to find out. That would be OK if you just had to run these tools without parameters; no problem with that. But the libtool repair tool requires you to give the full name of the OLD library which, great, has just been deinstalled, and by the time you notice, you have no way of finding out what it was called. And revdep-rebuild only runs through cleanly in rare cases; usually you have to intervene and reinstall the broken packages by hand. Topping that, you get told to run this-and-that repair tool (I haven't even mentioned rebuilding Python object code files when you install a new version of Python, and other goodies), if you're lucky, in some small post-install message that usually gets lost in the build log of the next package (yes, these messages also land in a special file somewhere without all the clutter, but who reads that?).
* Dependency gridlock. I've stopped counting how many times I have version X of package A installed and want to upgrade to version Y, but version Y depends on version Z of B, and version X of A blocks version Z of B, so I have to manually uninstall A and reinstall it to resolve the lock.
* Manual configuration file updates. After installing new packages, there's a neat-in-theory tool (in fact, two of them) to update the files in /etc for the new version. Problem: it doesn't work nearly as well as, say, "svn merge". I often have to approve changes to files I never even touched, and for every file I did edit I have to merge my changes with the upstream changes piece by piece, even when only the header was adapted to the new version. And often, programs won't run until this upgrade process is done.
* No "security only" updates. Security updates can be automated in theory, but the security list is basically a list of packets and versions affected by security problems together with build instructions for a fix. The problem is that as a fix, you always get the most up to date version. So all the goodness of the previous rant points is in full effect for security updates, with the result that they can break your system, with the result that you can't have them applied automatically in a reliable way.
* Old versions of packages disappear really fast from the portage tree. That means that if your last upgrade is a few months old, you can't upgrade to a new version and be sure you can safely revert to the last known working version when things go bad. This is especially bad on laptops if you want power management to work completely. The proper working of my laptop's WLAN interface depends on the kernel, ndiswrapper, the Windows driver and the OS driver (to upload the firmware). Any breakage in any of these components and I'm cable-bound, and I was for the past two years until I noticed the firmware trick with the OS driver; before that, it had worked fine with just ndiswrapper.
* Not-so-popular software is not up to date. For that, you get better support from a really popular binary-based distribution, which the software's distributor is likely to support directly.
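To make the revdep-rebuild complaint concrete: all that tool really has to figure out is which installed packages still reference library sonames that an upgrade removed. Here's a toy model of that logic in a few lines of Python; the package names and sonames are invented for illustration, and the real tool of course scans the ELF files on disk instead of taking a dictionary:

```python
# Toy model of the job revdep-rebuild automates: find installed packages
# whose binaries reference shared-library sonames that no longer exist.
# The package data below is made up; the real tool scans ELF files on disk.

def broken_packages(needed, sonames_on_disk):
    """Map each package to the sorted list of sonames it needs but
    that are no longer installed; packages with none are omitted."""
    result = {}
    for pkg, sonames in needed.items():
        missing = sorted(set(sonames) - sonames_on_disk)
        if missing:
            result[pkg] = missing
    return result

# After an upgrade, libexpat.so.0 was replaced by libexpat.so.1,
# but mplayer was still built against the old soname:
needed = {
    "mplayer": ["libc.so.6", "libexpat.so.0"],
    "vim":     ["libc.so.6"],
}
on_disk = {"libc.so.6", "libexpat.so.1"}

print(broken_packages(needed, on_disk))
# {'mplayer': ['libexpat.so.0']} -- these would have to be re-emerged
```

The detection part is trivial; the pain is everything after it, when the rebuilds themselves fail and you're back to fixing things by hand.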
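The /etc update complaint is essentially about two-way versus three-way merging: the tool shows upstream's new default against your file, instead of merging both sides' changes against the common ancestor the way "svn merge" does. With a proper three-way merge, a file where upstream only bumped the header and you only changed one setting merges automatically. Here's a rough sketch of the idea using Python's stdlib difflib; this is a simplified illustration, not how etc-update or dispatch-conf actually work:

```python
import difflib

def merge3(old, mine, new):
    """Tiny three-way merge over lists of lines. Where only one side
    diverges from the common ancestor 'old', take that side; where
    both diverge, emit conflict markers."""
    def match_map(a, b):
        # Maps old-line index -> other-file index for lines difflib matched.
        m = {}
        for blk in difflib.SequenceMatcher(None, a, b).get_matching_blocks():
            for k in range(blk.size):
                m[blk.a + k] = blk.b + k
        return m

    mine_at = match_map(old, mine)
    new_at = match_map(old, new)

    out = []
    o_cur = m_cur = n_cur = 0          # cursors into old, mine, new
    for i in range(len(old) + 1):
        at_end = i == len(old)
        # A sync point is an old line matched unchanged in BOTH files.
        sync = (not at_end and i in mine_at and i in new_at
                and mine_at[i] >= m_cur and new_at[i] >= n_cur)
        if not (sync or at_end):
            continue
        m_end = len(mine) if at_end else mine_at[i]
        n_end = len(new) if at_end else new_at[i]
        old_chunk = old[o_cur:i]
        my_chunk = mine[m_cur:m_end]
        new_chunk = new[n_cur:n_end]
        if my_chunk == old_chunk:        # I didn't touch it: take upstream
            out.extend(new_chunk)
        elif new_chunk == old_chunk:     # upstream didn't touch it: keep mine
            out.extend(my_chunk)
        elif my_chunk == new_chunk:      # both made the same change
            out.extend(my_chunk)
        else:                            # real conflict: show both versions
            out += ["<<< mine"] + my_chunk + ["==="] + new_chunk + [">>> new"]
        if not at_end:
            out.append(old[i])           # the line all three agree on
            o_cur, m_cur, n_cur = i + 1, m_end + 1, n_end + 1
    return out

# Upstream only bumped the header; I only changed one setting.
old  = ["# config v1", "opt = 1", "color = blue"]
mine = ["# config v1", "opt = 1", "color = red"]
new  = ["# config v2", "opt = 1", "color = blue"]
print(merge3(old, mine, new))
# ['# config v2', 'opt = 1', 'color = red']
```

With this scheme, the only files you'd ever be asked about are ones where your edits and upstream's edits actually collide, which is exactly the behavior I'm missing.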
With all that, the up-to-dateness of the ebuilds is a moot point. What use is it if you don't upgrade at all, knowing from experience that chances are high the upgrade will break stuff and you'll spend hours fixing it (on top of the hours the upgrade itself takes)?
Yeah, that was my rant. As I said, I'm looking into alternatives. Debian-based systems, with external repositories to work around the Free-Software-only restriction, look most promising; I've had the least trouble with Debian and Ubuntu when testing them in VMware for Armagetron autopackage compatibility. I'm a bit turned off by Ubuntu, though: we have it here at work and most development manpages plainly don't exist (is this just a setup problem? Are there extra packages for manpages?), the installed version of Firefox really likes to crash, and building RPM packages fails for very mysterious reasons (it can't find the source tarball even though it plainly exists right where the error message says it's looking for it). Are there good distributions out there for lazy developers? I mainly want a non-breaking system with easy access to tons of reasonably up-to-date software, with headers and documentation, and I don't want to do too much system maintenance work.