Virtual Thick Client
I want to develop a kernel for my server. It would work in the following way:
* The Notepad++ process runs on the client (with local data) without Notepad++ actually being installed on the client; it is installed on the server, so all of its libraries and files live on the server.
* So when the client wants to run Notepad++, it requests the Notepad++ process from the server. The process runs locally (on the client) while the libraries stay on the server, so I want to establish communication between the process and the libraries to run Notepad++.
* Finally, the client can just close the connection and the process is gone, leaving no trace of Notepad++ since it was never installed on the client. So it's like a "use and throw" service.
Please help me out, I am a newbie. Sorry for explaining the same thing again and again.
-> APP (libraries etc.) installed on the SERVER
-> DATA present on the CLIENT, i.e. local
-> APP's PROCESS running on the CLIENT
You can do this already without futzing with building a custom kernel (a non-trivial process). If it's a Windows client and a Linux server, run Samba on the server and mount the directory as a Windows share. If the client is Linux as well as the server, then you can either mount the remote directory with CIFS (the Linux Samba client file system type), or, if it is exposed as a mountable NFS directory, you can NFS-mount it locally. Then just add the executable's location to your PATH environment variable, and the shared libraries' location to the LD_LIBRARY_PATH environment variable.
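A minimal sketch of that client-side setup, assuming a Linux client, a server reachable as `appserver`, and an NFS export `/opt/apps` containing `bin/` and `lib/` directories (all of these names are placeholders):

```shell
#!/bin/sh
# Run a server-hosted application locally by mounting its install directory
# and pointing PATH / LD_LIBRARY_PATH at it.
# Placeholders: appserver, /opt/apps, /mnt/apps -- adjust to your setup.

SERVER=appserver
EXPORT=/opt/apps
MOUNTPOINT=/mnt/apps

# Dry-run by default so the script can be inspected without root or an
# NFS server; set DRY_RUN=0 to actually execute the mount.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = 1 ]; then echo "$@"; else "$@"; fi; }

run mkdir -p "$MOUNTPOINT"
# NFS mount (for a Samba share, use: mount -t cifs //server/share instead)
run mount -t nfs "$SERVER:$EXPORT" "$MOUNTPOINT"

# Make the server-hosted binaries and shared libraries visible locally.
export PATH="$MOUNTPOINT/bin:$PATH"
export LD_LIBRARY_PATH="$MOUNTPOINT/lib:${LD_LIBRARY_PATH:-}"

# The process now runs on the client, loading code over the mount.
run notepad++
```

Closing the session is then just unmounting: `umount /mnt/apps` leaves nothing of the application behind.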
Thanks for replying.
I used Notepad++ as an example; say I want to run an IDE (Eclipse) on the client system without installing it on the client, just providing a temporary process to the client.
Yes, it's more like serving /opt via NFS.
For simplicity I will use a Linux client.
Is it feasible to build a Linux-based server with software installed on it that clients use as temporary processes running on the client?
The only difference between a cloud and my server would be that the processing happens on the client rather than on the server (as in the cloud), even though the software is installed on the server itself.
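On the server side, serving /opt via NFS comes down to one export entry. A sketch, run as root on the server, with a hypothetical LAN subnet that you would adjust to your network:

```shell
# Export /opt read-only to clients on the local subnet (example subnet;
# requires an NFS server package such as nfs-kernel-server to be installed).
echo '/opt 192.168.1.0/24(ro,no_subtree_check)' >> /etc/exports

# Re-read /etc/exports and apply the new export.
exportfs -ra
```

Read-only (`ro`) is usually what you want here, since clients only execute the software and keep their own data locally.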
Ok. Thanks for the clarification. My advice is not to do this, but to provide a centralized location with a default Eclipse setup (plugins, directories, etc.) and a script to install it on each user's system. Why? Because that way, if the server is down, users can continue to work. Don't confuse simplicity (control) with reliability and productivity.
Hey Rubberman, thanks for replying.
This way I could provide them software as a service for as long as they want, without the burden of having the software installed on the client's system.
Now, why such complications?
At the moment I just want it for Eclipse. In the future I may install more software, and then my clients will have more than one application to work with without installing any of them locally.
So clients would just need a good connection and enough RAM to run the software, with no HDD (or only a small one) for storing their local data.
So it's enjoying software as a service, locally. :)
So please, could you help me out?
My email ID is email@example.com. Could you provide your email ID for further communication?