Just an Idea I had
Firstly, I don't know how one would set this up, or even whether it's possible, as I wouldn't consider myself a programmer (some Java, HTML, and BASIC... but not much, and that was a while back).
On to the idea...
Most people know the idea behind Folding@Home. It's essentially a project designed to learn more about protein folding by distributing the work across many volunteers' computers. Each machine runs a client program and donates some of its idle processor time, which speeds the project up.
Then there's distcc, a program that splits compiling across a network.
Well, the idea I had is similar, but would be especially useful to Gentoo. Most people complain about the long compile times involved with Gentoo installs and updates (especially if your computer is under 1 GHz). I thought it would be very useful to somehow set up a program, similar to Folding@Home, that could split up the compiling among everyone who is using it. Let's say it's called "Gentoo@Home"... just a hypothetical name. Whoever is using "Gentoo@Home" would get a speed increase when compiling some packages, since the work would be split up among the other computers involved (a combination of Folding@Home and distcc, if you will). I'm sure that in the beginning the speed increase would be zero (or even a decrease, since you would be sending lots of information across the internet), but once more people start using it, I'd think the speedup would be impressive... and since many people who use Gentoo are on broadband (I'd assume so, since Gentoo involves many large downloads and is very internet-dependent), the uploads/downloads should be fairly quick.
The idea is that, if we get many people to use this, the one big annoyance of Gentoo (long compile times) would be eliminated, making Gentoo an excellent all-around OS: you would still compile from source, but it wouldn't take long.
What are everyone's thoughts on this? Is it possible to do? Safe? Would it be a worthwhile step?
To me this sounds like a great idea.
Implementation might be a bear though.
The problem is that Folding@Home's work units (each a minuscule fraction of the total computation) can take days to complete. Basically the issue is: what happens when your WU gets handed to the old 233 MHz Pentium Pro? And since this isn't just distributed contribution, but each user also submits custom WUs, only the bottom half of the users would really benefit.

Also, network speeds aren't really high enough to make that feasible. The Folding system submits a relatively small set of data; the upload is larger, but still not mammoth. The thing is that even with a caching system you would still have frequent patching, and eventually the management of a low-bandwidth patched system would be enormous. Basically, Folding@Home works because its work submissions are coordinated by one server. distcc isn't smart enough to handle this type of thing: it just picks the node with the most available resources, and processing that kind of scheduling information across the whole internet would be near impossible.

Throw in how people could be dicks and mess with code submissions (thus limiting how open-sourced the system could be, or forcing MD5 or SHA-1 hashing of everything, which is just another resource drain on the cluster).
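The hashing step mentioned above is at least cheap to sketch. Assuming each compiled submission were checked against a digest published by a coordinating server, the verification would amount to something like this (the filenames here are made up for illustration):

```shell
# Simulate a node verifying a received submission against the original.
echo 'int main(void) { return 0; }' > submission.c
cp submission.c received.c

# Digests match: the submission arrived untampered.
[ "$(sha1sum < submission.c)" = "$(sha1sum < received.c)" ] \
  && echo "ok: digests match"

# A malicious node alters the file; the digest no longer matches.
echo '/* injected */' >> received.c
[ "$(sha1sum < submission.c)" != "$(sha1sum < received.c)" ] \
  && echo "rejected: digest mismatch"
```

The check itself is trivial; the cost the poster is pointing at is doing it for every object file shipped across the network.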
While it's a novel idea, I feel it would really just collapse under its own weight.
How about a similar idea in which you daisy-chain your computers (if you have more than one) together and use their combined processing power? This would essentially be like the 3D animation world, where they tag-team a bunch of systems to do a very similar thing. I believe they call it a render farm. Some lower-budget guys will buy up old computers solely for this purpose.
That would be distcc. I'm planning to use it to do a stage 1 install of Gentoo on my 200 MHz P1...
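For anyone wanting to try the same on a LAN, a rough sketch of the Gentoo-side distcc setup looks like this (the IP addresses are placeholders for your own network; check the official Gentoo distcc guide for the details on your version):

```shell
# On each helper box: install distcc and start the daemon
# (you may also need to allow your subnet in /etc/conf.d/distccd).
emerge distcc
/etc/init.d/distccd start

# On the slow machine doing the emerge: point distcc at the helpers.
/usr/bin/distcc-config --set-hosts "192.168.0.2 192.168.0.3 localhost"

# Then in /etc/make.conf, let Portage use distcc with parallel jobs:
#   FEATURES="distcc"
#   MAKEOPTS="-j8"
```

Note that only compilation is farmed out; preprocessing and linking still happen on the local box, which is why the 200 MHz machine remains the bottleneck for everything except the actual compile steps.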