  1. #1
    Just Joined! | Join Date: Nov 2006 | Location: Harrisburg, PA, USA | Posts: 56

    Client-Server Socket Connection Question


    Hi,

    I have written server code that accepts TCP/IP socket connections from clients. The server calls listen() with its last argument (backlog) set to 1, so there can be at most 1 pending connection. The server also accepts incoming client connections from any IP address.

    However, I have observed that even after one successful client connection, if I try to connect more than one client, all of the clients connect to the server successfully. My understanding is that unless the server calls accept(), it does not establish a socket connection with the client. Yet the clients still connect successfully.

    Does the underlying Linux TCP/IP stack handle this?
    Please explain this behavior.

    Regards,
    Sumit
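
    For reference, here is a rough sketch of the kind of server being described - this is not the original poster's code, and the port number, error handling, and overall layout are only illustrative:

    Code:
    /* Minimal blocking TCP server sketch: IPv4, hypothetical port 5000,
     * listen() backlog of 1 as described in the question. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int main(void)
    {
        int listen_fd = socket(AF_INET, SOCK_STREAM, 0);
        if (listen_fd < 0) { perror("socket"); exit(1); }

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);   /* accept from any IP */
        addr.sin_port = htons(5000);

        if (bind(listen_fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("bind");
            exit(1);
        }
        if (listen(listen_fd, 1) < 0) {             /* backlog of 1 */
            perror("listen");
            exit(1);
        }

        for (;;) {
            int client_fd = accept(listen_fd, NULL, NULL);
            if (client_fd < 0) { perror("accept"); continue; }
            /* talk to the client here, then: */
            close(client_fd);
        }
    }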

  2. #2
    Linux Guru Rubberman | Join Date: Apr 2009 | Posts: 11,448
    Location: I can be found either 40 miles west of Chicago, in Chicago, or in a galaxy far, far away.
    The backlog is only relevant to SIMULTANEOUS connection requests - that is, connections the kernel has already queued but the server has not yet accept()ed. It has NOTHING to do with how many actual connections the server can handle. This is a common misconception among newbie socket programmers.
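
    To make the distinction concrete: on Linux, the kernel itself completes the TCP three-way handshake and queues the finished connection on the listening socket; accept() merely dequeues it, which is why a client's connect() can return success before the server has accepted anything. Below is a rough client-side sketch, assuming the hypothetical server above is running on 127.0.0.1:5000 with a sleep() temporarily added before its accept() loop so nothing gets dequeued:

    Code:
    /* Client sketch: connect() succeeds as soon as the kernel finishes the
     * handshake, even though the server has not yet called accept().
     * Assumes the hypothetical server above on 127.0.0.1:5000. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int main(void)
    {
        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons(5000);
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

        for (int i = 0; i < 2; i++) {               /* two back-to-back clients */
            int fd = socket(AF_INET, SOCK_STREAM, 0);
            if (fd < 0) { perror("socket"); exit(1); }

            if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) == 0)
                printf("client %d: connected, but not yet accepted\n", i);
            else
                perror("connect");
            /* keep fd open so the connection stays queued on the server */
        }
        pause();                                    /* hold the connections */
        return 0;
    }

    How many un-accept()ed connections the kernel will queue like this is implementation-dependent (on Linux it is roughly backlog + 1); beyond that, new SYNs are dropped and further connect() attempts stall and retry rather than failing immediately.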

  3. #3
    Just Joined! | Join Date: Nov 2006 | Location: Harrisburg, PA, USA | Posts: 56
    Thanks. It is clear now.

    Regards,
    Sumit

  4. #4
    Linux Guru Rubberman | Join Date: Apr 2009 | Posts: 11,448
    Location: I can be found either 40 miles west of Chicago, in Chicago, or in a galaxy far, far away.
    So, if you want to limit the server to a single connection, you need to monitor the connection in the server and reject any new connection requests until the current client has disconnected, or until its socket has timed out if you choose to implement a timeout - a good idea in order to keep one client from "hogging" the server's resources.
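
    One possible way to implement that, as a rough sketch rather than a recipe - the fork()-based layout, port number, and handle_client() stub below are all illustrative:

    Code:
    /* Sketch: a server that serves one client at a time and actively
     * rejects new connections while it is busy. fork() is just one way
     * to structure this; port 5000 and handle_client() are illustrative. */
    #include <errno.h>
    #include <signal.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <sys/wait.h>

    static volatile sig_atomic_t busy = 0;

    static void reap_child(int sig)          /* the current client went away */
    {
        (void)sig;
        while (waitpid(-1, NULL, WNOHANG) > 0)
            busy = 0;
    }

    static void handle_client(int fd)        /* stand-in for the real protocol */
    {
        char buf[256];
        ssize_t n;
        while ((n = read(fd, buf, sizeof(buf))) > 0)
            write(fd, buf, n);               /* echo until the client leaves */
    }

    int main(void)
    {
        signal(SIGCHLD, reap_child);

        int listen_fd = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(5000);
        if (bind(listen_fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("bind");
            exit(1);
        }
        listen(listen_fd, 1);

        for (;;) {
            int fd = accept(listen_fd, NULL, NULL);
            if (fd < 0) {
                if (errno == EINTR) continue;   /* interrupted by SIGCHLD */
                perror("accept");
                continue;
            }
            if (busy) {                  /* already serving someone: reject */
                close(fd);
                continue;
            }
            busy = 1;
            if (fork() == 0) {           /* child serves the single client */
                close(listen_fd);
                handle_client(fd);
                _exit(0);
            }
            close(fd);                   /* parent keeps only the listener */
        }
    }

    Note that a "rejected" client's connect() still succeeds for an instant; it is simply disconnected right away. If you also want to time out an idle active client, one place to do it would be a receive timeout (for example via setsockopt() with SO_RCVTIMEO) inside the client-handling code.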
