One-binary-fits-all vs individual package management systems
One thing you've probably noticed when using products from proprietary companies is that they often distribute their program as a single binary file (usually with a .bin suffix) that can be run on any distribution. Examples include Adobe AIR and the Linux clients of commercial games (like those provided by id Software). I have to say, I absolutely LOVE when companies provide these kinds of installers instead of packages for each distro. They come with all the libraries and other dependencies you need (a godsend for boxes without Internet access), and many times they stick everything in /opt (or some other user-specified directory) so you know exactly where everything lives.
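For the curious, the self-contained approach usually boils down to a launcher script that points the dynamic linker at the bundled libraries before starting the real program. Here's a minimal sketch of the idea — "myapp" is a made-up application name, and a temp directory stands in for /opt so the sketch runs anywhere:

```shell
# Hypothetical layout: everything for "myapp" lives under one prefix,
# the way a .bin installer would drop it into /opt/myapp.
PREFIX=$(mktemp -d)/myapp
mkdir -p "$PREFIX/bin" "$PREFIX/lib"

# The launcher prepends the bundled lib directory to LD_LIBRARY_PATH,
# so the app finds its own copies of libraries instead of the system's.
cat > "$PREFIX/myapp.sh" <<EOF
#!/bin/sh
export LD_LIBRARY_PATH="$PREFIX/lib:\$LD_LIBRARY_PATH"
exec "$PREFIX/bin/myapp" "\$@"
EOF
chmod +x "$PREFIX/myapp.sh"
```

This is why such installers work across distros: as long as the kernel and a basic libc are there, the app never asks the package manager for anything.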
However, this method of distribution goes against the Filesystem Hierarchy Standard (FHS), which specifies separate locations for binaries, libraries, configuration files, and so on. At some level, I still like that system. If I'm looking for a configuration file, I know to check /etc. If I want a library, I'll poke around in [/usr]/lib (although libraries still seem to be scattered around somewhat arbitrarily, unless there's a standard I'm just not aware of).
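To see how much an ordinary FHS-installed binary leans on those system-wide library locations, ldd shows which shared libraries the dynamic linker would resolve for it (the exact output varies by distro and libc):

```shell
# List the shared libraries /bin/sh links against; each line pairs a
# library name with the full path the dynamic linker resolved it to,
# typically somewhere under /lib or /usr/lib.
ldd /bin/sh
```

A bundled-everything .bin app sidesteps exactly this resolution step, which is the whole trade-off in a nutshell.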
So which do you prefer? If the day ever comes (or has come) when you make a large, popular Linux application, what will you do? Take control of everything yourself and provide a one-binary-fits-all solution? Or stick with current standards and provide packages for each distribution? Or maybe just offer the source code and make users compile it themselves?
Standards are nice and all, but it wouldn't be so bad if everything were universal. Would it?