Let's talk about clusters
Whenever I read about clusters, it sounds like you need special software written for batch processing. In other words, the program has to be made specifically for computer clusters.
Is there any way to simply hook up a bunch of computers and have them process a task, such as the data from any random program, without it having to be made for a cluster?
I'm talking about reducing the load on any one computer while being able to take any program and have it processed by multiple computers.
For example, having Firefox, Google Earth, GIMP, and games processed by multiple computers without those programs being made for cluster technology in the first place.
I envision it like this:
I install, say, three Ethernet ports in each of a few computers, connect them all, and they process the data.
In the background I'd have a program that reads the data to be processed and sends it out to the other computers.
Those computers then process the data and shoot it back to the main computer.
The results are then put together into a whole.
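The flow described above is basically a scatter/gather pattern. Here's a minimal sketch of it in Python, using threads as a stand-in for the worker computers (a real cluster would ship each chunk over the network instead); the function names and chunk sizes are just illustrative. The catch it demonstrates is the answer to the question: the *program itself* has to know how to split its work into independent chunks, which is exactly what ordinary desktop apps don't do.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Hypothetical per-chunk work; on a real cluster this would run on
    # a remote node and the result would be shot back to the main computer.
    return sum(x * x for x in chunk)

def scatter_gather(data, n_workers=3, chunk_size=4):
    # Scatter: split the data into chunks for the "worker computers".
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # Each worker processes its chunk independently of the others.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partial_results = list(pool.map(process_chunk, chunks))
    # Gather: the main computer puts the partial results together.
    return sum(partial_results)

if __name__ == "__main__":
    print(scatter_gather(list(range(12))))  # same answer as doing it on one machine
```

Notice that this only works because the problem was explicitly divided into independent pieces; a program like Firefox or GIMP has no such division built in, so nothing outside it can farm its work out.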
Is any of that possible?
Or must programs be made for clusters?
The way I'm looking at it, GIMP, Google Earth, Firefox, and other programs would have to be made for clusters. There doesn't seem to be any kind of daemon that farms these tasks out to other computers.
I read that, and it seems to say that running one process across multiple computers is a difficult task.
I think there is a need for this, but I'm also thinking it would run very slowly.
Regardless, I think it would be nice to have.
Running a single process on a load-balancing cluster would mean old computers never seem so old.