I have this small snippet of code (a minimal working example of the problem I'm running into):
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
void xorBuffer(unsigned char* dst, unsigned char* src, int len)
{
    /* XOR len bytes of src into dst, one byte at a time */
    while (len != 0)
    {
        *dst ^= *src;
        dst++;
        src++;
        len--;
    }
}
int main(void)
{
    unsigned char* a = malloc(32);
    unsigned char* b = malloc(32);
    int t;

    memset(a, 0xAA, 32);   /* fill a with 0xAA */
    memset(b, 0xBB, 32);   /* fill b with 0xBB */

    xorBuffer(a, b, 32);   /* a[i] ^= b[i] for all 32 bytes */

    printf("result = ");
    for (t = 0; t < 32; t++) printf("%.2x", a[t]);
    printf("\n");

    free(a);
    free(b);
    return 0;
}
This code is supposed to perform the exclusive-or of two 32-byte memory buffers (conceptually, it should do a = a ^ b). Since 0xAA ^ 0xBB = 0x11, it should print "11" thirty-two times.
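Spelling out the per-byte arithmetic I expect:

  0xAA = 1010 1010
  0xBB = 1011 1011
  ----------------  (bitwise XOR)
  0x11 = 0001 0001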
My problem is that when I compile this under MinGW-GCC (Windows), it works perfectly in debug mode (no optimizations) but crashes with a SIGILL midway through the xorBuffer loop once optimizations at -O3 or above are enabled. Also, if I put a printf inside the offending loop, it works perfectly again. I suspect stack corruption, but I just don't see what I'm doing wrong here.
Trying to debug this with GDB with optimizations enabled is a lost cause: all GDB shows me is "variable optimized out" for every variable (and, of course, if I try to printf a variable, it suddenly works again).
Does anybody know what the heck is going on here? I have spent too long dwelling on this issue, and I really need to fix it properly to move on. My guess is that I am missing some fundamental C pointer knowledge, but to me the code looks correct. It could be the pointer incrementation, but as far as I know sizeof(unsigned char) == 1, so the loop should be stepping through each buffer one byte at a time.
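Just to illustrate what I mean, here is a tiny separate test program (not part of the snippet above) confirming that incrementing an unsigned char pointer advances exactly one byte:

#include <stdio.h>

int main(void)
{
    unsigned char buf[2] = { 0, 0 };
    unsigned char* p = buf;

    /* sizeof(unsigned char) is 1 by definition in C */
    printf("sizeof(unsigned char) = %d\n", (int)sizeof(unsigned char));  /* prints 1 */
    /* p + 1 points exactly one byte past p */
    printf("byte distance = %d\n", (int)((char*)(p + 1) - (char*)p));    /* prints 1 */
    return 0;
}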
For what it's worth, the same code works fine even with optimizations enabled when compiled with GCC on my Linux box.
So... what's the deal here? Thanks!
As requested, the assembly output of the whole program:
With -O2: clicky
With -O3: clicky
I observe this behavior with GCC 4.6.2 (running under MinGW).