I have written server code using the Windows Sockets API. I create a socket and
bind it to a particular IP address and port number. I then set a receive
timeout on the socket using the SO_RCVTIMEO socket option, and call recv() to
wait for data. Since a receive timeout is set, recv() should return once the
timeout expires. But that is not what happens: recv() blocks indefinitely.
I am facing this issue only on a few of the Vista systems; on other Vista
systems it works fine (i.e. we are able to switch between blocking and
timed-out receives). There is no problem on any of the XP systems.
Could you please provide any information related to this that would help pin
down the cause?
Thanks in advance,
Rajni
"War Eagle" wrote:
clientSocket.Receive(PreRxBuffer, 0,4, 0);
Does this function block? What happens if the client only sends 3 bytes? What if the client sends 5 bytes or 5000 bytes?