
I'm trying to demonstrate an integer overflow bug and its consequences by writing a small program as follows:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char** argv)
{
    size_t len = 0;
    sscanf(argv[1], "%zu", &len);
    char* buffer = malloc(len + 5);
    strcpy(buffer, argv[2]);
    printf("str = '%s'\n", buffer);
    return 0;
}

A safe input to this program is like this:

./program  16  "This is a string"

while an unsafe input that demonstrates the integer overflow is like this:

./program  18446744073709551613  "`perl -e 'print "This is a very very large string "x20'`"

Yet to my surprise, even though the integer overflow happens and a very small buffer is allocated, the program does NOT produce any SEGMENTATION FAULT; it executes fine to the end without any problems!

Can someone explain why this is?

I'm compiling this with GCC 5.2.1 and running it on a 64-bit Ubuntu system.

A more complete version of the code can be viewed here.

Seyed Mohammad
    Undefined behavior is undefined. It is not required to do anything in particular, including "crashing". – EOF Jun 27 '16 at 14:15
  • Marking this question as a duplicate of "http://stackoverflow.com/questions/31450678" is **ridiculous**! This question isn't asking why we need `malloc`, it's asking: *why copying a large string to a small buffer on heap doesn't cause a SEG_FAULT*. – Seyed Mohammad Jun 27 '16 at 14:28
  • @SeyedMohammad Agreed that it isn't the best duplicate, but the problem is the same. When writing outside allocated memory, anything can happen. Nobody is _required_ to give you a seg fault. Just like you aren't guaranteed to get hit by a car if you go jogging on the highway. Even though it is likely that you get hit by a car, you might just as well make it through unscratched. This time. – Lundin Jun 27 '16 at 15:07

1 Answer


What you're seeing here is simply undefined behavior. It may appear to work sometimes, but only by chance, and it will often fail in more complex scenarios - like allocating another buffer and later freeing them both, etc.

Specifically, the C library allocates a bigger chunk of memory from the OS and splits it up on demand. In other words, the memory just past your buffer still exists and, from the OS's point of view, is valid. But writing there will sooner or later corrupt another buffer or the allocator's internal bookkeeping, causing either a fault or just unexpected content.

Zbynek Vyskovsky - kvr000
  • Thanks. The actual problem was that I was expecting an immediate feedback similar to a "stack-based overflow"; while a "heap-based overflow" in a small and simple program like this might never produce a crash ... Why? ... Because there's a lot of space on the heap and it doesn't affect the program control flow (contrary to the stack), unless some other allocating and freeing is done. – Seyed Mohammad Jun 27 '16 at 14:37