I have a server that, upon accepting a connection, immediately does:
Packet packet = new Packet();
clientSocket.BeginReceive(packet.buffer, 0, packet.buffer.Length, SocketFlags.None,
    new System.AsyncCallback(ClientReceiveCallback), packet);
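(For context, Packet is essentially just a buffer plus a byte counter - something like the sketch below; the actual buffer size shouldn't matter for the question:)

class Packet
{
    public byte[] buffer = new byte[1024]; // size is arbitrary here
    public int bytesRead = 0;
}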
Then I have:
private void ClientReceiveCallback(IAsyncResult ar)
{
    if (!clientSocket.Connected)
        return;

    Packet packet = (Packet)ar.AsyncState;

    // Read data from the client socket.
    packet.bytesRead += clientSocket.EndReceive(ar);

    // do stuff with info read in packet

    // post another receive for the next chunk
    Packet newPacket = new Packet();
    clientSocket.BeginReceive(newPacket.buffer, 0, newPacket.buffer.Length,
        SocketFlags.None, new System.AsyncCallback(ClientReceiveCallback),
        newPacket);

    // forward to the server
    if (packet.bytesRead != 0)
        serverSocket.BeginSend(packet.buffer, 0, packet.bytesRead,
            SocketFlags.None, new System.AsyncCallback(ServerSendCallback),
            packet);
}
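(In case it matters, ServerSendCallback doesn't do anything interesting - it's roughly just this:)

private void ServerSendCallback(IAsyncResult ar)
{
    // complete the pending send to the server
    serverSocket.EndSend(ar);
}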
However, when running my app, I noticed that my CPU usage was sitting at 100%.
I stepped through in the debugger and found that ClientReceiveCallback is
CONSTANTLY being called - but EndReceive is returning 0, so there aren't
actually any bytes to read. Any ideas why this is happening?
Could having the BeginSend on serverSocket in flight at the same time as the
BeginReceive on clientSocket be messing something up?
Thanks
Adam Clauss
ca*****@tamu.edu