  1. #1

    Shared lib folders

    Hello everyone, I'm an Italian guy with a passion for programming (and Linux, of course).

    I like to try new distros on my PC, but each time I do, I have to reinstall all the libraries (often recompiling them by hand). Keeping the same library in two different places also wastes disk space (at one point I had more than 5 GB in my lib folders).

    I would like to have a couple of folders (lib32 and lib64) on a different partition, so that I can install libraries there and let every distro I install use them.

    Before I get technical: is this even possible, and why (or why not)?

    I experimented with using libraries from a different distro on a different filesystem and it worked, but would it work with every other distro?

    I thought different distros would probably have their own optimizations when compiling the same library. Is that correct?

    Thanks in advance for your help.

  2. #2
    Blackfooted Penguin daark.child's Avatar
    Join Date
    Apr 2006
    West Yorks
    Hi, and welcome to the forum. I think your idea will be difficult to implement: different distributions do not necessarily use the same version of a library, and if a library is compiled against one distribution, there is no guarantee that it will work correctly on another.
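    A quick way to see this on any running system is to list the shared libraries a binary was linked against. Each entry names an exact soname (e.g. libc.so.6), and the version behind that soname is whatever the distro shipped; a binary or library built against one distro's versions can easily fail on another's:

    ```shell
    # Print the shared libraries /bin/ls needs, and where the current
    # system resolves them. On a different distro, the same sonames may
    # resolve to different (and possibly incompatible) versions.
    ldd /bin/ls
    ```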

  3. #3
    Thank you for your answer.

    Is it possible to sum up these differences?
    I mean, would it be possible to recompile the libraries with no distro-related optimizations, so that I can use them on every distribution?

  4. #4
    Trusted Penguin Irithori's Avatar
    Join Date
    May 2009
    Using the same lib on every distro is not the goal.
    There are a lot of distributions for various use cases.

    For example, Fedora is a testbed for Red Hat and usually has quite new software.
    Gentoo and other "rolling distros" compile their software according to their package manager configs.

    At the other end, Red Hat/CentOS offer a stable platform.
    The tools/libs, as well as their versions and features, are chosen and then frozen.
    There will be no version/feature upgrades during the lifecycle of a major version, which is valuable for mass deployments and SLAs.

    So currently it works (very roughly) this way:
    - A tool/lib is made available via sourceforge/github/freshmeat/etc as source code.
    - Package maintainers evaluate it
    - They choose a version, possibly patch it, and then compile and package it into RPMs or DEBs
    - The RPMs and DEBs are copied to repositories
    - From now on the packages are installable for the users
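    The packaging step in that list can be sketched by hand. This is only an illustration, not a real package: the package name, paths, and maintainer below are all made up, and the final build runs only where dpkg-deb exists (Debian/Ubuntu and friends).

    ```shell
    set -e
    # Lay out the minimal tree dpkg expects: metadata under DEBIAN/,
    # payload files at their final install paths.
    PKG=demo-lib_1.0_all
    mkdir -p "$PKG/DEBIAN" "$PKG/usr/lib/demo"
    cat > "$PKG/DEBIAN/control" <<'EOF'
    Package: demo-lib
    Version: 1.0
    Architecture: all
    Maintainer: Example Maintainer <user@example.org>
    Description: placeholder library package for illustration
    EOF
    touch "$PKG/usr/lib/demo/libdemo.so.1"
    # Build the .deb only if the tool is available on this machine.
    if command -v dpkg-deb >/dev/null; then
        dpkg-deb --build "$PKG"
    fi
    ```

    Real distro packages add a lot on top of this (dependencies, maintainer scripts, signing), which is exactly the work the package maintainers in the list above do.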

    In your use case:
    Imho, as long as you actively develop these libs, use a VM and create a makefile to quickly compile and install them.
    Once things stabilize, or you want to distribute them in a controlled manner to multiple machines, building packages is the clean way to go.
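    As a rough illustration of that compile-and-install loop (the library name, source, and install prefix here are invented; a small Makefile would just wrap these commands):

    ```shell
    set -e
    # A one-function shared library, compiled and installed under a
    # private prefix instead of the distro-managed /usr/lib.
    cat > demo.c <<'EOF'
    int demo_answer(void) { return 42; }
    EOF
    gcc -fPIC -shared -Wl,-soname,libdemo.so.1 -o libdemo.so.1 demo.c
    mkdir -p "$HOME/.local/lib"
    cp libdemo.so.1 "$HOME/.local/lib/"
    ```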
    Last edited by Irithori; 12-07-2012 at 08:39 PM. Reason: typo
    You must always face the curtain with a bow.

  5. #5
    Thanks a lot for all this information.

    Programming is just a hobby for me, not my job (at least not yet).
    I'm not developing libraries, just using them to develop software (and just for myself).

    Anyway, your idea of using a virtual machine is a very nice solution to my problem: this way I can download the libraries I need only once and keep the VM image on a separate partition, visible to any other system installed on my PC.

    I'll mark this thread as solved, thanks again for your help.
