I'm reading a stream of binary data off of a UDP socket. I have two integers to read: one is four bytes long, in buffer[2] through buffer[5]; the other is one byte, in buffer[6]. Doing the following works for some values.
unsigned int int_x1 = buffer[2]<<24 | buffer[3]<<16 | buffer[4]<<8 | buffer[5];
unsigned int int_x2 = buffer[6];
But often the leftmost bits get stuffed with 1s. For example, int_x2 reads 0 through 127 fine, but the next 128 values produce very large numbers, and it doesn't work again until it hits 256. What's happening is that every bit to the left of the first "1" is also being set to "1". I know this is a known issue, but I don't know how to code around it.
When I do the following I get garbage data in both int_x variables (although, oddly, int_x2 by itself seems to work for at least some values).
memcpy(&int_x1, &buffer[2], 4);
memcpy(&int_x2, &buffer[6], 1);
And the following code also produces garbage.
unsigned int int_x1 = *(int *)&buffer[2];