  1. #1
    Just Joined!
    Join Date
    Feb 2008
    Posts
    72

    One-binary-fits-all vs individual package management systems


    One thing you've probably noticed when using products from proprietary companies is that a lot of times they distribute their program as a binary file (usually with the .bin suffix) that can be run on any distribution. Examples would be Adobe's AIR or the Linux clients of commercial games (like those provided by id Software). I have to say, I absolutely LOVE when companies provide this type of installer instead of packages for each distro. It comes with all the libraries and other dependencies you need (a godsend for boxes without Internet access), and many times they stick everything in /opt (or some other user-specified directory) so you know where it's all at.
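    Under the hood, one of these .bin/.run files is usually just a shell script with a compressed archive glued onto the end of it. A rough sketch of the idea (the marker, paths, and names here are made up for illustration; real installers, e.g. makeself-generated ones, do a lot more checking):

        #!/bin/sh
        # Minimal self-extracting installer sketch. Everything below the
        # __ARCHIVE__ marker is an appended tar.gz holding the program
        # plus all of its bundled libraries and other dependencies.
        PREFIX="${1:-/opt/exampleapp}"   # default to /opt, allow an override

        # Find the first line after the marker, then unpack from there.
        ARCHIVE_LINE=$(awk '/^__ARCHIVE__$/ {print NR + 1; exit}' "$0")
        mkdir -p "$PREFIX"
        tail -n +"$ARCHIVE_LINE" "$0" | tar xzf - -C "$PREFIX"

        echo "Installed to $PREFIX"
        exit 0
        __ARCHIVE__
        (binary tar.gz data appended here)

    That self-contained layout is exactly why it works offline: nothing has to be fetched at install time.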
    However, this method of distribution goes against the FHS (Filesystem Hierarchy Standard), which specifies separate locations for binaries, libraries, configuration files, and so on. At some level, I still like that system. If I'm looking for a configuration file, I know to check /etc. If I want a library, I'll poke around in [/usr]/lib (although libraries still seem scattered around in a fairly arbitrary way, unless there is a standard and I'm just not aware of it).
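    For a hypothetical app called foo, the two layouts look roughly like this (the paths are illustrative, not from any real package):

        # FHS-style: files split across the standard locations
        /usr/bin/foo                 # the executable
        /usr/lib/libfoo.so.1         # its shared library
        /etc/foo.conf                # system-wide configuration

        # Self-contained style: everything under one directory
        /opt/foo/bin/foo
        /opt/foo/lib/libfoo.so.1
        /opt/foo/etc/foo.conf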

    So which do you prefer? If the day ever comes (or has come) when you make a large, popular Linux application, what will you do? Take control of everything yourself and provide a one-binary-fits-all solution? Or stick with current standards and provide packages for each distribution? Or maybe just offer the source code and make users compile it themselves?

    Standards are nice and all, but it wouldn't be so bad if everything were universal. Would it?

  2. #2
    Linux Guru bigtomrodney's Avatar
    Join Date
    Nov 2004
    Location
    Ireland
    Posts
    6,133
    I definitely prefer package management; I'll use a .run script where I have to. A lot of new users complain about the number of package managers, and they're usually silenced by the existing users, but I do think it's time we tried to standardise. To be fair, they all just implement each other's changes and none of them really provides a clear advantage.

    The usual dependency argument that gets peddled is more a case of the relevant dependencies not being packaged, or of differences in naming conventions. Realistically I feel that, with the LSB standardising more elements of the naming conventions and freedesktop.org helping with other aspects, it's time we decided on one format. If that were the case then I would expect all proprietary software to be distributed using this method. I think .run would be out the window then... I really only tolerate it, to be honest.
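    A familiar example of the naming problem: the same development library goes by different package names depending on the distro, so a dependency list written for one won't resolve on another.

        # Same OpenSSL headers, two different package names:
        apt-get install libssl-dev        # Debian / Ubuntu
        yum install openssl-devel         # Fedora / Red Hat / CentOS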

  3. #3
    Linux Engineer GNU-Fan's Avatar
    Join Date
    Mar 2008
    Posts
    935
    I think it is a good thing to let Upstream concentrate on the actual application and the distribution maintainers worry about the compile & install procedure.
    Not only because of the filesystem layout, but also for the sake of machine-dependent optimizations. (There are more architectures out there than i386 & x64.)

    If you look at the sites of the package maintainers, you will see that they all do distro-specific patching anyway.

    That said, Upstream's tarball should also have a proper configure & make scheme, in case an individual wants the newest version that hasn't made it into a package yet.
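    By that I mean the classic three-step routine (assuming an autotools-style tarball; "foo-1.2" is just a placeholder name):

        tar xzf foo-1.2.tar.gz && cd foo-1.2
        ./configure --prefix=/usr/local   # choose where it installs
        make                              # here you can add machine-specific CFLAGS
        sudo make install

    This is also where the machine-dependent optimizations come in: you compile for exactly the CPU you have instead of settling for a lowest-common-denominator i386 build.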
    Debian GNU/Linux -- You know you want it.

