#include <iostream>
#include <cstring>   // memset
#include <climits>   // INT_MIN
#include <cstdlib>   // system
using namespace std;

int main()
{
    int array[10];
    memset(array, INT_MIN, sizeof(array));
    cout << INT_MIN << endl;
    for (int i = 0; i < 10; i++)
        cout << array[i] << endl;
    system("pause");
}

When I use "memset(array, -1, sizeof(array))", I get the correct result. However, when I use INT_MIN instead of -1, all the outputs are 0, even though INT_MIN should be -2147483648.

2 Answers


The "problem" with memset is that it doesn't fill in the int you give it, but the unsigned char conversion thereof. See here:

Value to be set. The value is passed as an int, but the function fills the block of memory using the unsigned char conversion of this value.

Due to this, if you give it INT_MIN as the second parameter, every byte is set to (unsigned char)INT_MIN, which is 0, so you end up with all zeros. For this specific task, I'm afraid that memset is not the right tool for the job, because it sets each byte individually and thus won't allow you to set an int at will, unless the desired pattern consists of the same byte over and over, which is not the case here.
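A minimal sketch of the usual alternative, std::fill from <algorithm>, which assigns whole int elements rather than individual bytes (the surrounding code mirrors the question's):

#include <algorithm>
#include <climits>
#include <iostream>
using namespace std;

int main()
{
    int array[10];
    // fill assigns the value INT_MIN to each int element,
    // so any int value works, unlike the byte-wise memset.
    fill(array, array + 10, INT_MIN);
    for (int i = 0; i < 10; i++)
        cout << array[i] << endl; // prints -2147483648 ten times (32-bit int)
}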

Blaze
  •     To clarify why `memset` is not the right tool: it can only set every byte to the same value, so it can only produce multi-byte numbers all of whose bytes are equal, which in the case of four-byte numbers with eight-bit bytes means multiples of 16843009 (see the sketch after these comments). Since the sign representation is implementation-defined, no particular negative number is guaranteed to have such a representation. `memset` is very rarely used to set anything other than 0 or -1, and the latter only for unsigned numbers, unless it is acceptable to rely on 2's complement. – eerorika Oct 23 '18 at 12:54
  • Very good point, I added it to the answer. – Blaze Oct 23 '18 at 12:59
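A small sketch demonstrating the comment's arithmetic, assuming 32-bit int and 8-bit bytes: filling with byte value b yields the unsigned pattern b * 16843009, since 16843009 is 0x01010101:

#include <cstring>
#include <iostream>
using namespace std;

int main()
{
    unsigned int x;
    for (int b = 0; b <= 3; b++)
    {
        memset(&x, b, sizeof(x)); // every byte of x becomes b
        cout << x << endl;        // prints 0, 16843009, 33686018, 50529027
    }
}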

The call:

memset(array, INT_MIN, sizeof(array));

sets every byte to the result of the cast (unsigned char)INT_MIN, because memset operates on bytes, not on the elements of the array.
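A short sketch of that byte-level effect, assuming two's complement and 32-bit int, which also shows why -1 happens to work:

#include <climits>
#include <cstring>
#include <iostream>
using namespace std;

int main()
{
    int x;
    memset(&x, -1, sizeof(x));      // (unsigned char)-1 is 0xFF; x becomes 0xFFFFFFFF, i.e. -1
    cout << x << endl;
    memset(&x, INT_MIN, sizeof(x)); // (unsigned char)INT_MIN is 0x00; x becomes 0
    cout << x << endl;
}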

zcorvid
    Exactly right. And the odd thing you'll find is that the 2's complement representation of INT_MIN is 1000 0000 0000 0000 0000 0000 0000 0000, so when you cast to unsigned char, you get 0000 0000. – jwismar Oct 23 '18 at 12:44