  1. #1
    Just Joined!
    Join Date
    Jun 2006
    Location
    (.)
    Posts
    69

    dynamic and static libraries


    Hello,

    I don't know the difference between dynamically linked libraries and statically linked libraries.

    Does anyone know which one is best to use, and why?

    Thanks

  2. #2
    Linux Engineer Zelmo's Avatar
    Join Date
    Jan 2006
    Location
    Riverton, UT, USA
    Posts
    1,001
    Discover Wikipedia.
    Stand up and be counted as a Linux user!

  3. #3
    Just Joined!
    Join Date
    Jun 2006
    Location
    (.)
    Posts
    69
    Would it be possible for you to discuss it in this thread? Documentation on any subject can be found through Google; I chose a forum so we could discuss the pros and cons and hear everybody's views, like a poll.

  5. #4
    Linux Engineer Zelmo's Avatar
    Join Date
    Jan 2006
    Location
    Riverton, UT, USA
    Posts
    1,001
    Oh, this is a good candidate for discussion. There have been a few different approaches to linking employed by different OSes, and I think Linux apps have taken every approach possible.

    One approach is to have everything dynamically linked--that is, have all of your libraries in one standardized location, and installed programs will look to that location for the libraries they need. Linux traditionally uses this approach, and it's what often led to "dependency hell" for distros that don't use apt-get or yum or some other method of resolving dependencies automatically. But it keeps the system lean, because you only install the libraries you need, and there are no duplicate libraries in other locations.

    Another approach is to make the program self-contained: either link the libraries statically, so their code is copied into the executable itself, or ship copies of the libraries in a directory local to the installed program. This ensures that the program has all the libraries it needs, and that they are the correct (or at least working) versions, at the cost of having the same library code installed multiple times for different programs. This takes up extra disk space. But again, on the plus side, package maintainers don't need to worry about how to ensure that the system has the needed libraries, so programs are easier to develop and deliver. PC-BSD does this with its PBI package management system (but still supports traditional ports and pkg_add, which use dynamic linking).

    What MS Windows does is use the best of both worlds. They package all the needed libraries with their programs, and the installer checks to see if the newer versions of the libraries already exist on the system. If not, they get installed to a standardized location, and the programs are dynamically linked. So Windows always has the latest version of every library it needs, and no more. The only drawback there is that the download size for each program is bigger than it really has to be, because they will always include all of the libraries they need (except for the libraries that are included with the base OS). That's where systems that use apt-get or yum have an edge, because they too will only keep the latest versions of just the libraries they need, but they do it by downloading the libraries separately, which allows the program packages to be smaller.

    Other than packaging, there's one danger in using dynamic linking compared to static, and one small annoyance. The danger is that if a library gets removed because the package manager or the system administrator mistakenly thinks it's not needed any more, programs that still need it will cease to work properly (if at all). The small annoyance is that if programs are removed or upgraded such that certain libraries are no longer needed, their non-use may go unnoticed and they take up disk space unnecessarily. On that note, Debian's deborphan program is good at catching unused libraries, so Debian-based systems are relatively strong on all the points discussed.
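
    That failure mode is easy to demonstrate with a home-made shared library (a sketch, invented names; the rpath stands in for a system-wide library directory):

```shell
# The same tiny library, built as a shared object this time
cat > greet.c <<'EOF'
#include <stdio.h>
void greet(void) { puts("hello from a shared library"); }
EOF
cc -fPIC -shared greet.c -o libgreet.so

cat > main.c <<'EOF'
void greet(void);
int main(void) { greet(); return 0; }
EOF

# -rpath '$ORIGIN' tells the loader to look for libgreet.so next to
# the executable, much as it would look in /usr/lib for a system library
cc main.c -L. -lgreet -Wl,-rpath,'$ORIGIN' -o prog_shared
./prog_shared                  # works while libgreet.so is present

# Now "mistakenly" remove the library
rm libgreet.so
ldd prog_shared                # reports: libgreet.so => not found
./prog_shared || echo "broken: shared library missing"
```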
    Stand up and be counted as a Linux user!

  6. #5
    Just Joined!
    Join Date
    Nov 2007
    Posts
    1

    dynamic and static libraries

    This is all fine and good, but the real question is not whether anyone should be silly enough to use a dynamic link, but rather: how does one use Fedora and get static libraries? I always want to compile with static libraries, but I keep running into trouble because, for some reason, Fedora doesn't seem to come with *.a libraries, which are needed for compiling with the -static option.

    In particular, I'm using Fedora 7, and I spent a lot of time setting it up for programming. However, there are almost no *.a files; for example, libXmu.a and libX11.a are missing. Even after running yum install libXmu-devel, it only installs the practically worthless *.so files. How could anyone ever compile statically with those? What's the point? Am I seriously going to have to spend entire days of my life looking for ways to get *.a files because someone else wanted to save a few kilobytes of disk space?

    Seriously, does anyone know a simple way to automatically get a *.a version of every *.so file on your system? I consider it a bug that Fedora doesn't do this automatically.

    Thanks in advance for your help.

    Sincerely,

    Sean =)


    Quote Originally Posted by Zelmo View Post
    Oh, this is a good candidate for discussion. There have been a few different approaches to linking employed by different OSes, and I think Linux apps have taken every approach possible.

    [...]

  7. #6
    Just Joined!
    Join Date
    Oct 2007
    Posts
    15
    Quote Originally Posted by sean.mcguffee@gmail.com View Post
    This is all fine and good, but the real question is not whether anyone should be silly enough to use a dynamic link, but rather, how does one use fedora and get static libraries. I always want to compile with static libraries but am always having trouble because for some reason, fedora doesn't seem to come with *.a libraries, which are needed for compiling with a -static option.
    You can use Ermine (http://magicErmine.com) or statifier (http://statifier.sf.net)
    to convert your dynamically linked executable into a "pseudo-static" one.


    Valery.
