  1. #1
    Just Joined! narnold
    Join Date: Nov 2010
    Posts: 4

    Are compilations created equal?


    My first post; I can't figure out where I should put it - apologies in advance for a misdirected inquiry.

    To the programmers out there:
    I've got a question regarding the compiling of software. Although I took a few courses in college, I am definitely not a programmer and certainly not knowledgeable about the inner workings of computers. I am fairly new to Linux, having tried Ubuntu for less than a year now, but I find it seems to be a better system - the best descriptor I can give is to analogously call it "sturdier feeling" than Windows - and toss in the "free factor" and it becomes pretty compelling. So instead of my Ubuntu installation being a mere stopgap measure, I find I'm going to want to use it more. And that likely will mean installing some out-of-the-mainstream software.
    One area that is scary to a new Linux user is the compiling of software. Sooner or later, though, one will want something not available in the software repositories. My first experience with this was the latest iteration of the VLC media player, which, not being in the Ubuntu repository, had to be "compiled from source," as the lingo has it. I found, somewhat to my surprise, that this wasn't too terribly difficult, but rather extremely tedious, as the "./configure" command had to be run many, many times, and between runs I fetched another library file from Synaptic. But at least the error messages were intelligible! I got it done and now have a .deb file as a result.
    My question is: if I compile from source, and you compile from that same source but use your machine rather than mine, will our resulting .deb files be basically identical? And would this be the same as a Debian package one might download with that same .deb extension? If not, how are those Debian packages different? I ask because it would seem to me that if they are essentially the same file, it would behoove me to save it in case I ever want to install it on another machine, or share it with a friend, or whatever.
    (FYI for the curious: I was hoping that the newest VLC media player would play VCDs on my laptop. But it won't, so I haven't really gotten what I was looking for in the compilation exercise! Whether that's VLC's fault or my machine's problem, I don't know.)
    Anyway, I'd like either an explanation of this or a suggestion as to where I might find one. There's no need to get too deep into computerese, as that would be beyond my grasp anyway. And thanks to all who took the time to read this!

  2. #2
    Just Joined!
    Join Date: Dec 2009
    Location: Maryland, USA
    Posts: 82
    This is really a 'packaging' question, and LF doesn't have a specific sub-forum for those questions that I can find. I've seen them asked pretty much everywhere but maybe most often in the Programming / Scripting or Applications areas.

    I'm not an expert on packaging, but my observation from a little experience is that, "No, not all packages are equal." The deb package you made is specific to the machine you made it on. A package made for one computer MAY work on another, similarly configured computer. Further, it is possible to build a package that will work on a large number of computers by including a superset of all the files that may be needed.

    A complicating factor is that different families of Linux use different package formats, and the different formats have different build utilities.

    I've provided generalities that may only be partially right, but someone smarter will come along and correct my errors and/or give a better answer.

  3. #3
    Administrator MikeTbob
    Join Date: Apr 2006
    Location: Texas
    Posts: 7,864
    Hello and Welcome,
    I've moved your thread here; it seems to be a better place for it, plus you've got one post now. Posts in the Coffee Lounge don't add to your post count.

    BTW. VLC is in Ubuntu repos.
    I do not respond to private messages asking for Linux help, Please keep it on the forums only.
    All new users please read this.** Forum FAQS. ** Adopt an unanswered post.

    I'd rather be lost at the lake than found at home.

  4. #4
    Linux User sgosnell
    Join Date: Oct 2010
    Location: Baja Oklahoma
    Posts: 494
    Yes, vlc is in the Ubuntu repos, but not the latest version.

    When you compile software, you basically get a .bin file, which is the executable program. A .deb file is a complete package, which may include several executables, libraries, text files, or whatever else. If you're compiling for your own use on your own computer, there is no need for a .deb file in most cases. You just run configure, make, and make install, and you're done. Not all .deb files are equal, because different people may decide to include different files. The executables should be the same, though, as long as the basic computer architecture is the same.
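
    For reference, the usual sequence looks something like this (just a sketch - the tarball name is a placeholder for whatever you actually downloaded):
    Code:
    tar xzf vlc-x.y.z.tar.gz   # unpack the source
    cd vlc-x.y.z
    ./configure                # check for dependencies and generate the Makefiles
    make                       # compile
    sudo make install          # copy the results into place (often under /usr/local)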

  5. #5
    Just Joined! narnold
    Join Date: Nov 2010
    Posts: 4

    Thanks for the responses

    Thank you for the responses; these are the broad generalities that I wanted. As I understand you, sgosnell, the .bin (binary, I presume?) file is the meat and potatoes of the program, but since one can't just throw an executable at a machine and expect it to run right, it generally will need other executables to go along with it? Besides, who doesn't want a side dish of good veggies?
    As far as the .deb file I got, I ran configure, make, and checkinstall, since from what I gathered this has sort of superseded make install. Maybe I'm off base here. Anyway, I got a .deb file, along with other things, after all was said and done. The program seems to work, so I'm guessing things basically went as desired.
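
    In other words, the sequence I followed was roughly this (reconstructing from memory):
    Code:
    ./configure
    make
    sudo checkinstall   # builds a .deb and installs it, in place of 'make install'
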
    But the real effect is that now I know I can do it again when I want to!

  6. #6
    Linux User sgosnell
    Join Date: Oct 2010
    Location: Baja Oklahoma
    Posts: 494
    The executable doesn't have to have a .bin extension, and in fact rarely does. It's usually just named something like fu, or bar, or make, or whatever. The executable may depend on libraries to work, much as Windows executables often require .dll files. A .deb file sometimes includes all the dependencies, which may or may not be executables; they may be libraries (the name usually starts with lib, but not always), and the package may also include documentation files. There are no hard and fast rules, and a .deb file may certainly fail to include dependencies. It's up to the developer, or whoever does the compilation and distribution. In short, you get at least what you pay for, and often more.

    Library dependencies can cause headaches, because the developer may assume most people already have the libraries installed. It's not usually a problem, because apt-get and Synaptic check for dependencies and automatically download anything not already installed. When you do a manual compile/install, you have to figure out everything for yourself. The docs may tell you, or they may not.
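
    One tool that can help with the figuring-out is apt-file, which looks up which package ships a given file (it has to be installed first; the header name below is just an example):
    Code:
    sudo apt-get install apt-file
    sudo apt-file update          # fetch the file-to-package index
    apt-file search libfoo.h      # find which package provides this file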

    I've never seen a .deb file come out of a make. This webpage shows how to make a .deb package, if you're interested.
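
    The bare-bones manual method, if you're curious, goes something like this (the names here are made up, and a real package wants more control fields and more care):
    Code:
    mkdir -p mypkg/DEBIAN mypkg/usr/local/bin
    cp myprog mypkg/usr/local/bin/    # lay out files as they should land on the target system
    # create mypkg/DEBIAN/control containing at least:
    #   Package: myprog
    #   Version: 1.0
    #   Architecture: i386
    #   Maintainer: you <you@example.com>
    #   Description: hand-rolled test package
    dpkg-deb --build mypkg mypkg_1.0_i386.deb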

  7. #7
    Linux Guru reed9
    Join Date: Feb 2009
    Location: Boston, MA
    Posts: 4,651
    Quote Originally Posted by sgosnell:
        I've never seen a .deb file come out of a make.
    The poster used checkinstall, not make install, for a quick and dirty deb package.

    When compiling software, there are many variables that can make your build different from another person's. Processor architecture is a big one; software compiled for one platform will not run on another. For example, Arch Linux provides i686 and x86_64 packages. Folks using older processor architectures (i386 - i586) cannot run Arch using the provided binaries. (These are all x86 processors, let alone getting into ARM or MIPS.) There are cross compilers to build for architectures other than the host's.
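
    You can check what you have with:
    Code:
    uname -m                    # the kernel's report of the hardware (e.g. i686 or x86_64)
    dpkg --print-architecture   # the architecture your Debian/Ubuntu packages target (e.g. i386)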

    Beyond that, there are tweaks one can make to optimize your build for particular hardware.
    Gentoo Linux Documentation -- Compilation Optimization Guide
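
    In practice that usually means setting compiler flags before running configure, along these lines (a sketch - the right flags depend on your compiler and CPU):
    Code:
    CFLAGS="-O2 -march=native" CXXFLAGS="-O2 -march=native" ./configure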

    Also, many pieces of software have varying build options. These are usually passed to the configure script. Taking VLC, which you built, as an example, here is the Arch Linux build script.

    The pertinent part is here:
    Code:
    ./configure --prefix=/usr \
                  --disable-rpath \
                  --enable-faad \
                  --enable-v4l \
                  --enable-snapshot \
                  --enable-dbus-control \
                  --enable-nls \
                  --enable-lirc \
                  --enable-pvr \
                  --enable-ncurses \
                  --enable-mozilla \
                  --with-live555-tree=/usr/lib/live \
                  --enable-realrtsp
    Different people could disable options they don't want or enable other options. Various options will affect software dependencies. (There are dependencies required for the software to run, but also dependencies required only if you need certain functionality.)

    For software with a configure script, usually running
    Code:
    ./configure --help
    will give you a list of available compile time options.

  8. #8
    Just Joined! narnold
    Join Date: Nov 2010
    Posts: 4
    Thank you, reed9....

    I'm still in early-morning mode and am trying to digest this whole thing. But this is when the brain often works the best for me. I'm thinking of a way to reframe my original question to keep my perspective.

    Here is, in a nutshell, the process I followed: I got the tarball, put it in a directory, and unpacked it. This is easy to do from the Ubuntu GUI. Then I opened a terminal, changed to the directory, and gave the ./configure command. I'd run ./configure until it got stuck looking for a file, which generally turned out to be one of many development libraries. If I recall correctly, I found every one of those in Synaptic; some I had to Google to find the proper package associated with the listed file, when the file name was sufficiently different from the package name.

    I was suitably impressed with how basically easy, if tedious, this process is. But I was wondering how "good" the install was. Admittedly, I may be laboring under an illusion, but I have/had the idea that a package made by a "real" computer person would be superior to one that I make here. I base that assumption on the simple fact that something that is intended for wide distribution would have a much broader target base than just my laptop here where I did the task.

    I find there are packages available for various applications which one can download and install from packager's sites. Generally, the site will list links for packages suitable to assorted architectures, Linux variations, etc. For many of us, of course, this route has its attractions, by which I mean it feels more like installing Windows software. In Ubuntu it certainly feels very much the same - one double-clicks on a file with a .deb extension and a familiar process unfolds and in a few minutes one has a new program installed.

    And I just wondered to myself how such a package would compare to my install. What prompted the question in my mind is the VCD playback problem that I was experiencing on the older version available in the Ubuntu repositories. My custom install fared no better. So I thought to myself - if I could find a "proper" package, rather than compile my own, might it be somehow better and able to handle VCD playback?

    And now to reframe my question: If I had two computers, both with the same version of Ubuntu installed and with the same architecture - suppose one was this basic laptop and the other was a gaming machine with a potent video and sound card - and I performed the compilation routine I outlined above on each using the same source code, how would the resulting "quick and dirty" .deb files differ? Again, I'm assuming that "real" packages are somehow better, or at least able to accommodate a wider range of machines - can I use that "quick and dirty" .deb file on another machine of mine, or should I repeat the compilation steps for every install?

    Thank you all - this has been greatly helpful to me!

  9. #9
    Linux Guru reed9
    Join Date: Feb 2009
    Location: Boston, MA
    Posts: 4,651
    I don't know all the details of what checkinstall does, but it doesn't create a real .deb file suitable for distribution to other machines. As I understand it, it's mostly a quick way to have packages you've compiled yourself tracked by the package manager. (One of the downsides to compiling stuff yourself is that it can make cleanup and uninstalling difficult, as you may have to track down all the installed files yourself for removal.)
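
    That tracking is the main benefit: once checkinstall has registered the package, removal becomes an ordinary package operation, something like this (the package name depends on what checkinstall recorded):
    Code:
    sudo dpkg -r vlc   # remove the package and all its tracked files in one go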

    Assuming the two machines were compatible hardware-wise, then you could, I believe, install the checkinstall-built package on the other machine. In theory, optimizing the compilation for your particular processor can give a performance boost. Something compiled for i386, which is quite old and lacks some abilities of the newer i686 architecture, would run a little slower than something optimized for i686. How noticeable this performance gap is in the real world is a matter of debate.
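
    Installing it there would just be the usual dpkg routine (the file name below is hypothetical; note that dpkg will not chase down any dependencies the quick-and-dirty package failed to declare):
    Code:
    sudo dpkg -i vlc_1.1.4-1_i386.deb
    sudo apt-get -f install   # fix up any declared-but-missing dependencies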

    The difference in "professional" packages I would say is more in how they fit into the whole software ecosystem of your machine. Because in linux software packages are not usually self-contained, many packages all share the same libraries, you can find installing a newer (or older for that matter) version of this or that program can cause breakage with other pieces of software. Or having custom packages can cause problems when trying to upgrade from one distro release to another.

    Also, distros often apply custom patches to the software for various reasons. For example, looking at the Ubuntu source package for VLC (on the right under 'download source', the link for vlc_1.1.4-1ubuntu1.debian.tar.gz), you'll find in the "Patches" folder a couple of modifications that do not exist in the Arch build. (You could of course download and use those patches in your personal build as well, though.) Arch and Slackware are known for providing "vanilla" packages, patching software only to prevent breakage, but otherwise keeping the software as released by the developers. Other distros like Ubuntu frequently apply custom patches.
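
    (If you wanted to fold one of those distro patches into your own build, the usual incantation from inside the unpacked source tree is something like this - the patch file name is a placeholder, and the -p strip level can vary:)
    Code:
    patch -p1 < ../some-fix.patch
    ./configure && make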

    One of the nice things about Ubuntu is the availability of personal package archives (PPAs). Community members build custom packages, often newer than those available in the official repositories. The quality of these packages can vary, and there is a potential security concern in installing packages from 3rd parties, but you can find trusted and popular 3rd-party repositories for many pieces of software. This can save a lot of time and hassle over building packages yourself.

    For example, VLC

    Install VLC 1.1.4 In Ubuntu [Via New PPA!] ~ Web Upd8: Ubuntu / Linux blog
    https://launchpad.net/~ferramroberto/+archive/vlc
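
    Adding a PPA like Roberto's is quick from the command line (the PPA id below is read off the Launchpad URL above):
    Code:
    sudo add-apt-repository ppa:ferramroberto/vlc
    sudo apt-get update
    sudo apt-get install vlc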

    Note the warning in the Webupd8 post:
        Important note: I noticed a lot of blogs recommend the n-muench/vlc PPA. DO NOT USE THAT PPA unless you only want VLC to work and no other media player / movie editor or any other package that needs FFmpeg to work. That PPA has newer FFmpeg packages and all applications depending on FFmpeg will be broken! This is not the case for Roberto's PPA - so this one is safe to use.
    Which goes back to my previous mention of being aware of the whole software ecosystem.

    I don't think you mentioned which Ubuntu you're using, but there's also the "Lucid Bleed" team, providing backported Ubuntu 10.10 packages for Ubuntu 10.04.

  10. #10
    Just Joined! narnold
    Join Date: Nov 2010
    Posts: 4
    reed9: Thank you! Now we're cookin' with gas - this is precisely the kind of answer I'm looking for.

    It seems one must therefore be careful with software. I guess I would have assumed (in the given Webupd8 example) that newer FFmpeg packages were backward-compatible, but obviously this is not the case. Or perhaps the packages attempt to be so, but sometimes fail to achieve that goal.

    However, this isn't unusual even in Windows. Although Windows software conflicts are occasional enough to be mostly an inconvenience, they're certainly not unheard-of.

    Again, thank you. I wish I could find a good book that would give broadly outlined insight like this at the beginning of its chapters, but I have yet to find one. I realize, however, that this IS a rather tall order ... it's not easy to remember, when one is writing a book, that the readership needs to be led by the hand!
