Bytes IT Community

Question about poll() on Linux

I'm trying to use poll() to handle network connections on Linux. As a test, I created a simple loop that polls a socket so that every time the client sends data, the server responds.

My problem is that I can't seem to perform error checking correctly. If I abruptly disconnect the client, I'd expect the server to report an error of some sort. Instead, it spins in a tight loop, polling as fast as it can.

Here is the loop for the server:

if ((client_fd = accept(server_sock, (sockaddr*) &m_addr, (socklen_t*) &addr_length)) != -1) {
    pollfd pfd;
    pfd.fd = client_fd;
    pfd.events = POLLIN;
    pfd.revents = 0;
    int timeout = -1;
    int retval;

    while (true) {
        if ((retval = poll(&pfd, 1, timeout)) > 0) {
            if (pfd.revents & POLLERR) {
                printf("An error occurred.\n");
                break;
            }
            if (pfd.revents & POLLHUP) {
                printf("A hang up occurred.\n");
                break;
            }
            if (pfd.revents & POLLIN) {
                if (recv(client_fd, buffer, MAX_RECV, 0) != -1) {
                    printf("Received: %s\n", buffer);
                    *buffer = '\0';
                }
                else perror("recv");
            }
        }
        else {
            perror("poll");
            break;
        }
    }
}
And the client simply connects to the server, and repeatedly sends a string, like this:

char* req = "Hello from the client";
int len = strlen(req);

while (true) {
    if (send(client_fd, req, len, 0) == -1) perror("send");
    sleep(5);
}
This works, but if I terminate the client process, the server process launches into an endless loop rather than handling the error.
Feb 11 '07 #1
1 Reply


RedSon
Expert 5K+
First, your client's connection to the server was successful, so it makes sense that you don't get an error condition: the server still thinks it's connected. What you have to do is design some kind of timeout for your server. If you don't get any data from the client after, say, 10 seconds, give up on the connection. Right now your timeout is -1, which tells poll() to block indefinitely; poll() takes its timeout in milliseconds, so 10 seconds would be 10000.
Feb 12 '07 #2
