I've searched the web for how to detect when a connected socket has been disconnected suddenly. Most answers suggest calling read() and checking whether it returns 0 (an empty string). However, read() blocks, so I tried using write() instead, expecting it to return -1 when the socket is disconnected.
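
For reference, this is roughly the read()-based check that kept coming up, written here as a non-blocking peek so it doesn't block (just a sketch of that idea on Linux; peer_disconnected and connfd are placeholder names I made up, not anything from my actual code):

    #include <errno.h>
    #include <sys/socket.h>
    #include <sys/types.h>

    /* Sketch only: a non-blocking peek to see whether the peer has gone away.
     * recv() returns 0 once the peer has closed its end of the connection. */
    static int peer_disconnected(int connfd)
    {
        char peek;
        ssize_t n = recv(connfd, &peek, 1, MSG_DONTWAIT | MSG_PEEK);
        if (n == 0)
            return 1;                                   /* orderly shutdown by the peer */
        if (n == -1 && errno != EAGAIN && errno != EWOULDBLOCK)
            return 1;                                   /* a real socket error */
        return 0;                                       /* still connected, or just no data yet */
    }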

Below is the code in my server where I want to detect any disconnection:
    if (write(connfd, buf, len) == -1) {   /* connfd, buf, len stand in for my real socket/buffer/length */
        printf("Socket disconnected");
        //do drastic socket disconnection measures here..
        break;
    }
The above works if the client is reading data slowly over the network, say one send per 1.5 seconds. However, if I let the client read a lot of data without any pause in between, the above somehow stops working: the server seems to ignore the write() == -1 check and never breaks out of the while loop, which in turn causes it to crash.
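
To make the two client behaviours concrete, the client's read loop is roughly one of these two (again just a sketch with placeholder names, not my exact client code):

    #include <sys/socket.h>
    #include <unistd.h>

    /* Sketch of the two client behaviours being compared; sockfd is a
     * placeholder for the client's connected socket descriptor. */
    static void read_slowly(int sockfd)
    {
        char buf[1024];
        while (recv(sockfd, buf, sizeof(buf), 0) > 0)
            usleep(1500000);            /* pause so reads keep pace with roughly one send per 1.5 seconds */
    }

    static void read_continuously(int sockfd)
    {
        char buf[1024];
        while (recv(sockfd, buf, sizeof(buf), 0) > 0)
            ;                           /* no pause at all between reads */
    }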

What intrigues me is: why would the way the client read()s affect the behaviour of the server at all?