Thread: Installing Software
Join Date: Jan 2011
I was thinking about making a couple of Linux From Scratch boxes, and I was wondering what the best way to install software on these boxes is. Ideally I would like to compile the software on just one of the boxes and then install the newly created binaries on the others, so I don't have to keep the compiler and build tools on every box. Not to mention I don't really want to recompile the same program five times.
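One common way to do what you describe is to stage the install into a scratch directory with DESTDIR, tar that up, and unpack the tarball on the other boxes. A rough sketch of the idea (the `pkg-demo` package, its file, and the directories are all made up here; on a real build you'd have `./configure && make && make DESTDIR=$staging install` producing the staged tree):

```shell
#!/bin/sh
# Sketch: build once, install the binaries everywhere.
set -e

staging=$(mktemp -d)

# Stand-in for `make DESTDIR="$staging" install`: the build stages
# its files under the prefix layout instead of the live filesystem.
mkdir -p "$staging/usr/bin"
printf '#!/bin/sh\necho hello\n' > "$staging/usr/bin/pkg-demo"
chmod +x "$staging/usr/bin/pkg-demo"

# Bundle the staged tree into a relocatable binary tarball.
tar -czf pkg-demo-bin.tar.gz -C "$staging" .

# On a real second box you would `scp` the tarball over and run
# `tar -xzpf pkg-demo-bin.tar.gz -C /`. Here we unpack into a
# scratch "root" to show the layout and permissions survive.
target=$(mktemp -d)
tar -xzpf pkg-demo-bin.tar.gz -C "$target"
"$target/usr/bin/pkg-demo"
```

The main caveat is that the boxes need matching libraries (same glibc and friends), which LFS boxes built from the same book usually have.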
Join Date: May 2006
Depends on what distro you are using, but most of the major distros today have GUI install apps such as Yum Extender or Synaptic which take all the pain out of installing software, especially when you install thousands of packages like I do. That seems much easier to me than downloading tarballs, finding the dependencies, and compiling each package as you go along.
Again, with most distros there are software repositories which take most of the pain out of installing and usually prevent dependency hell from occurring. You can also batch the installation and run that batch script on both machines. Since you only have two, the first one will help you work out the bugs, and you should then have a fully functional script to just run on the second one. Red Hat based distros use yum; Debian based distros usually use apt-get.
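A batch script like that can be made to run unchanged on both a Red Hat style and a Debian style box by detecting which package manager exists. A minimal sketch (the package list is just an example; the final `echo` is a dry run so you can check the command before letting it loose):

```shell
#!/bin/sh
# Sketch of one batch install script shared between the two boxes.
set -e

pkgs="vim rsync openssh-server"

# Echo the first command in the argument list that exists on this box.
pick_installer() {
    for cmd in "$@"; do
        command -v "$cmd" >/dev/null 2>&1 && { echo "$cmd"; return 0; }
    done
    return 1
}

pm=$(pick_installer yum apt-get) || pm=""
if [ -n "$pm" ]; then
    # Dry run: print the command; drop the echo once you trust it.
    echo "$pm install -y $pkgs"
else
    echo "no yum or apt-get on this box"
fi
```

Debug it on the first machine, then `scp` it over and run it on the second.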
A third, more painful option is to create a repository of your own on one machine: work out the dependency issues there, then serve up packages to the other machine. This might be the better option if you are using a more primitive distro or heavily customizing a distro with packages not found in that distro's repositories.
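On a Red Hat style system, the serving side is roughly: put your built RPMs in a directory, index it with createrepo, and serve it over HTTP. The client side then only needs a repo file. A sketch of that config fragment (the `buildbox` name, URL, and path are made up; `gpgcheck=0` assumes you haven't signed your own packages):

```ini
# /etc/yum.repos.d/buildbox.repo  (hypothetical local repository)
[buildbox]
name=Packages built on the first machine
baseurl=http://buildbox/repo
enabled=1
gpgcheck=0
```

After that, `yum install` on the second machine pulls from your own repo like any other.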
The last and most painful option is to grab tarballs for every package you want to install. One at a time, start to compile them; one at a time, hunt down the dependencies; then install the dependencies, find the dependencies your dependencies have, and so on until you've got all the tarballs you'll need. Then write a quick script, probably in bash, to install them in the order needed to satisfy the dependencies. Then scp the whole mess over to the other machine and run the script there. This means keeping up with updates for each and every one of the packages and installing them as needed. Aside from the advantage of never having to reinstall your distro as long as the hardware doesn't fail, it is generally not a good option. You might miss an important security update and get hacked, or find yourself with circular dependencies, meaning no matter what you do you're not going to install that package. You may have to remove certain components to install others. In general it is a nightmare, one from Linux's past that most people avoid if at all possible.
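The "install them in the necessary order" step above can at least be automated: list each "prerequisite package" pair and let tsort (from coreutils) print a safe install order. It will also loudly report the circular dependencies mentioned above. The package names here are made up:

```shell
#!/bin/sh
# Sketch: compute an install order that satisfies dependencies.
set -e

# Each line reads "prerequisite dependent".
cat > deps.txt <<'EOF'
zlib openssl
openssl openssh
zlib libpng
EOF

# tsort prints a topological order: every prerequisite comes
# before the packages that need it (and errors out on cycles).
tsort deps.txt > order.txt
cat order.txt

# Real use: loop over order.txt, untar and build each package,
# then scp the tarballs plus this script to the second box.
```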
Join Date: Feb 2011
Location: Patagonia, Argentina
You mean LFS? I built one a couple of years ago. I'm pretty sure you can do it that way, but I'm no expert, so I don't know if I'm missing something.
The LFS book advises using a package manager, and I think that's what I would do to keep things clean.
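Even without a full package manager, the simplest scheme the LFS book discusses amounts to recording which files each package installs, so you can uninstall or upgrade later. A sketch of that idea, with a made-up package name `demo` and scratch directories standing in for the real filesystem:

```shell
#!/bin/sh
# Sketch: track a package's files via a DESTDIR-style staged install.
set -e

destdir=$(mktemp -d)   # stand-in for `make DESTDIR=... install`
root=$(mktemp -d)      # stand-in for / on a real box

# Pretend the package installed one file.
mkdir -p "$destdir/usr/bin"
echo 'echo demo installed' > "$destdir/usr/bin/demo"

# Record every installed path, then merge into the live tree.
( cd "$destdir" && find . -type f ) > demo.files
cp -a "$destdir/." "$root/"

cat demo.files   # later: delete these paths to uninstall the package
```

Per-package file lists like this are the cheapest way to keep a hand-built system clean.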