I wrote a small server/client app. I created a class for my customized
socket object that uses the synchronous Socket functions, for both
server and client use.
When the client connects to the server and sends something, everything
works fine (i.e. the server receives what the client sent). But if I
try to make the server send anything, the client receives 0 bytes, even
though Send on the server returns the correct number of bytes.
Any idea?
When I send/receive on the server side, I use the client socket that I
get from serverSock.Accept();
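
In case it matters, the server socket is set up roughly like this before Accept() (a simplified sketch; the port number and anything not shown in the snippets below are placeholders, not my exact code):

// using System.Net; using System.Net.Sockets;
Socket serverSock = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
serverSock.Bind(new IPEndPoint(IPAddress.Any, 5000)); // placeholder port
serverSock.Listen(10);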
[Server Side]
clientSock = serverSock.Accept();
int bytesSent = clientSock.Send(buffer, buffer.Length, SocketFlags.None);
=> bytesSent > 0
[Client Side]
clientSock.Connect();
int bytesReceived = clientSock.Receive(buffer, 0, clientSock.Available, SocketFlags.None);
=> bytesReceived == 0 !!
If I send from the client and receive on the server instead, it works...
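
For comparison, the working direction looks roughly like this (simplified; clientSock on the server side is the socket returned by Accept(), and the exact Receive arguments here are from memory, not copied from my code):

[Client Side]
int bytesSent = clientSock.Send(buffer, buffer.Length, SocketFlags.None);
=> bytesSent > 0

[Server Side]
int bytesReceived = clientSock.Receive(buffer, 0, buffer.Length, SocketFlags.None);
=> bytesReceived > 0, and the buffer contains what the client sent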