
I have a function that reads image data, and I malloc a buffer whose size comes from the file header. The Fortify report flags an "Integer Overflow" here. But no matter whether I fix the code or check the value of colors, Fortify still reports "Integer Overflow".

Does anyone have any suggestions?

code:

int ReadInt()
{
    int rnt=0;
    rnt = getc(xxx);
    rnt +=  (getc(xxx)<<8);
    rnt += (getc(xxx)<<16);
    rnt += (getc(xxx)<<24);
    return rnt;
}

int image()
{
....
        image->header_size=ReadInt();
        image->width=ReadInt();
        image->height=ReadInt();
....    
        image->colors = ReadInt();

        int unit_size = 0;
        unit_size = sizeof(unsigned int);
        unsigned int malloc_size = 0;
        if (image->colors > 0 &&
            image->colors < (1024 * 1024 * 128) &&
            unit_size > 0 &&
            unit_size <= 8)
        {

            malloc_size = (image->colors  * unit_size);
            image->palette = (unsigned int *)malloc( malloc_size );
        }

....
        return 0;
}

Fortify report:

Abstract: The function image() in xzy.cpp does not account for
integer overflow, which can result in a logic error or a buffer overflow.
Source:  _IO_getc()
59 rnt += (getc(xxx)<<8);
60 rnt += (getc(xxx)<<16);
61 rnt += (getc(xxx)<<24);
62 return rnt;

Sink: malloc()
242 malloc_size = (image->colors * unit_size);
243 image->palette = (unsigned int *)malloc( malloc_size );
244
  • Use an `unsigned int`? Unsigned integers have well defined overflow semantics, whereas signed integers do not. – Cornstalks Oct 26 '18 at 04:01
  • `getc` will return a `char` typecast to an `int`. The return type of `getc`, however, is `int`. Possibly the tool is considering the range of `getc` incorrectly. – Rishikesh Raje Oct 26 '18 at 04:15
  • fortify could be wrong. You need to take that possibility into account. Try also other static source code analyzers, such as [Frama-C](http://frama-c.com/) and [Clang-analyzer](http://clang-analyzer.llvm.org/). BTW the type of `malloc_size` should be `size_t` – Basile Starynkevitch Oct 26 '18 at 04:43
  • @RishikeshRaje: Well, no; `getc()` might return a `char` promoted to an `int`, but it might also return `EOF` (which is a value that's guaranteed not to be a valid `char`). Imagine how much the compiler will like seeing `rnt += EOF << 24;`... – Brendan Oct 26 '18 at 05:09
  • @Brendan I tried checking the value of getc (whether it is EOF or not), but it had no effect :( – whatai cheng Oct 26 '18 at 05:37

1 Answer


Left-shifting an int has the potential for undefined behavior (UB) any time a "one" bit is shifted into the sign bit.

This can happen with arbitrary int values, as in `some_int << 8`.

getc() returns a value in the unsigned char range, or the negative EOF. Left-shifting EOF is UB. Left-shifting a value like 128, as in `128 << 24`, is UB.
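
A minimal sketch, assuming the common 32-bit int, of why these shifts go wrong:

#include <limits.h>
#include <stdio.h>

int main(void) {
    /* getc() returns either EOF (negative) or a value in [0, UCHAR_MAX]. */
    printf("INT_MAX    = %d\n", INT_MAX);             /* 2147483647 */
    printf("128 * 2^24 = %.0f\n", 128.0 * (1 << 24)); /* 2147483648: does not fit in int */
    /* So `128 << 24` shifts a 1 into the sign bit (UB), and `EOF << 24`
       left-shifts a negative value, which is also UB. */
    return 0;
}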

Instead accumulate the non-negative values from getc() using unsigned math.

I recommend changing the function signature to accommodate end-of-file/input errors.

#include <stdbool.h>
#include <limits.h>
#include <stdio.h>

// Read an int, least-significant byte first; return true on success.
bool ReadInt(int *dest) {
  unsigned urnt = 0;
  for (unsigned shift = 0; shift < sizeof urnt * CHAR_BIT; shift += CHAR_BIT) {
    int ch = getc(xxx);
    if (ch == EOF) {        // end-of-file or read error
      return false;
    }
    urnt |= ((unsigned) ch) << shift;  // unsigned shift: no UB
  }
  *dest = (int) urnt;
  return true;
}

When `urnt` exceeds `INT_MAX`, the `(int) urnt` conversion yields an implementation-defined result (or raises an implementation-defined signal), and this is commonly the expected functionality: values in `urnt` above `INT_MAX` "wrap around".

Alternatively, pedantic code can employ:

  if (urnt <= INT_MAX) {
    *dest = (int) urnt;
  } else {
    // Map [INT_MAX+1 ... UINT_MAX] onto [INT_MIN ... -1] without overflow.
    *dest = ((int) (urnt - INT_MAX - 1)) - INT_MAX - 1;
  }
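
With the bool return, the call sites in image() should check for failure too; a possible sketch (the field names follow the question's code, the -1 error return is an assumption to adjust to your own error handling):

        // Inside image(): bail out if the header is truncated or unreadable.
        if (!ReadInt(&image->header_size) ||
            !ReadInt(&image->width) ||
            !ReadInt(&image->height) ||
            !ReadInt(&image->colors)) {
            return -1;
        }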

Improvements for `image->colors * unit_size`:

    //int unit_size = 0;
    // unit_size = sizeof(unsigned int);
    size_t unit_size = sizeof(unsigned int);
    // unsigned int malloc_size = 0;
    if (image->colors > 0 &&
        // image->colors < (1024 * 1024 * 128) &&
        image->colors < ((size_t)1024 * 1024 * 128) &&
        unit_size > 0 &&
        unit_size <= 8)
    {
        // size_t math: the multiplication cannot overflow given the bounds above.
        size_t malloc_size = (size_t) image->colors * unit_size;
        // image->palette = (unsigned int *)malloc( malloc_size );
        image->palette = malloc(malloc_size);
    }

`1024 * 1024 * 128` is a problem when `INT_MAX < 134217728` (e.g. with a 28-bit int).
See "There are reasons not to use 1000 * 1000 * 1000".
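
A more general alternative that avoids hard-coding the bound is to check the multiplication against SIZE_MAX; a sketch, with a hypothetical helper (not from your code):

#include <stdint.h>   // SIZE_MAX
#include <stdlib.h>

// Hypothetical helper, sketch only: allocate `count` elements of `unit_size`
// bytes each, returning NULL if the product would overflow size_t.
static void *checked_alloc(size_t count, size_t unit_size) {
    if (unit_size == 0 || count > SIZE_MAX / unit_size) {
        return NULL;   // product would overflow size_t
    }
    return malloc(count * unit_size);
}

image() could then use `image->palette = checked_alloc((size_t) image->colors, sizeof *image->palette);` after checking `image->colors > 0`.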

chux - Reinstate Monica