I have a C++ program that uses a unix domain socket in this way: client A has a socket connection to server B. A sends some data to B. B processes ...
- 03-30-2007 #1
How to write a timer to monitor a socket server?
However, B might sometimes take too long to process the data. In that case, for efficiency, A would expect B to simply terminate the processing and notify A with a failure message rather than waiting forever. So, in B, I would like to write a timer T. It starts timing when B receives data from A, and it times out if B spends more than some predefined length of time processing the data. B then stops and sends a message back to A to let it know.
Any idea or suggestions for doing this? Thanks!
- 03-30-2007 #2
Use select() call and FD_SETs for this.
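As a rough illustration of what that looks like (a minimal sketch, not code from this thread; the socket setup and the actual processing on B's side are assumed to exist elsewhere), `select()` takes an optional `struct timeval` timeout, so B can wait for readability for a bounded time and report failure when the deadline passes:

```c
#include <stdio.h>
#include <sys/select.h>
#include <sys/time.h>
#include <unistd.h>

/* Hypothetical helper: returns 1 if `fd` becomes readable within
 * `seconds`, 0 on timeout (select() returns 0 when the timeval
 * expires), and -1 on error. */
int wait_readable(int fd, int seconds)
{
    fd_set readfds;
    struct timeval tv;

    FD_ZERO(&readfds);
    FD_SET(fd, &readfds);   /* watch only this descriptor */

    tv.tv_sec = seconds;
    tv.tv_usec = 0;

    return select(fd + 1, &readfds, NULL, NULL, &tv);
}
```

On a timeout (return value 0), B would stop its work and write the failure message back to A over the same socket. Note that `select()` may modify the timeval on Linux, so reinitialize it before every call.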
- 03-30-2007 #3
Originally Posted by cyberinstru
You already know what you want! So, what sort of help are you looking for from us?
The Unforgiven
Registered Linux User #358564
- 03-30-2007 #4
You can check whether you have received data on any descriptor using FD_ISSET... so based on that you can proceed. Otherwise, how do you know whether you are receiving data or not?
Maybe I am not aware of another way of knowing that; could you please explain, the_unforgiven?
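To make the FD_ISSET point concrete (a sketch under assumptions, not code from the thread; the two descriptors and the helper name are hypothetical), after `select()` returns you test each descriptor you registered and read only from the ones that are actually ready:

```c
#include <stdio.h>
#include <sys/select.h>
#include <unistd.h>

/* Hypothetical helper: blocks until fd_a or fd_b is readable, then
 * reads up to `len` bytes from whichever descriptor FD_ISSET reports
 * as ready.  Returns the byte count read, or -1 on error. */
int read_ready(int fd_a, int fd_b, char *buf, size_t len)
{
    fd_set rfds;
    int maxfd = fd_a > fd_b ? fd_a : fd_b;

    FD_ZERO(&rfds);
    FD_SET(fd_a, &rfds);
    FD_SET(fd_b, &rfds);

    /* NULL timeout: wait indefinitely for readiness */
    if (select(maxfd + 1, &rfds, NULL, NULL, NULL) <= 0)
        return -1;

    if (FD_ISSET(fd_a, &rfds))
        return (int)read(fd_a, buf, len);
    if (FD_ISSET(fd_b, &rfds))
        return (int)read(fd_b, buf, len);
    return -1;
}
```

The key detail is that `select()` modifies the fd_set in place, so after it returns the set contains only the ready descriptors, which is what FD_ISSET checks.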