
I found a snippet similar to this in some (C++) code I'm preparing for a 64-bit port.

int n;
size_t pos, npos;

/* ... initialization ... */

while((pos = find(ch, start)) != npos)
{
    /* ... advance start position ... */

    n++; // this will overflow if the loop iterates too many times
}

While I seriously doubt this would actually cause a problem even in memory-intensive applications, it's worth examining from a theoretical standpoint, because similar errors could surface that would cause real problems. (Change n to a short in the above example and even small files could overflow the counter.)

Static analysis tools are useful, but they can't detect this kind of error directly. (Not yet, anyway.) The counter n doesn't participate in the while expression at all, so this isn't as simple as other loops (where typecasting errors give the error away). Any tool would need to determine that the loop would execute more than 2^31 times, but that means it needs to be able to estimate how many times the expression (pos = find(ch, start)) != npos will evaluate as true—no small feat! Even if a tool could determine that the loop could execute more than 2^31 times (say, because it recognizes the find function is working on a string), how could it know that the loop won't execute more than 2^64 times, overflowing a size_t value, too?

It seems clear that to conclusively identify and fix this kind of error requires a human eye, but are there patterns that give away this kind of error so it can be manually inspected? What similar errors exist that I should be watchful for?

EDIT 1: Since short, int and long types are inherently problematic, this kind of error could be found by examining every instance of those types. However, given their ubiquity in legacy C++ code, I'm not sure this is practical for a large piece of software. What else gives away this error? Is each while loop likely to exhibit some kind of error like this? (for loops certainly aren't immune to it!) How bad is this kind of error if we're not dealing with 16-bit types like short?

EDIT 2: Here's another example, showing how this error appears in a for loop.

int i = 0;
for (iter = c.begin(); iter != c.end(); iter++, i++)
{
    /* ... */
}

It's fundamentally the same problem: the loop counts with some variable that never directly interacts with a wider type. The variable can still overflow, but no compiler or tool detects a casting error. (Strictly speaking, there is none.)

EDIT 3: The code I'm working with is very large. (10-15 million lines of code for C++ alone.) It's infeasible to inspect all of it, so I'm specifically interested in ways to identify this sort of problem (even if it results in a high false-positive rate) automatically.

Henry Merriam

3 Answers


Code reviews. Get a bunch of smart people looking at the code.

Use of short, int, or long is a warning sign, because the range of these types isn't fully defined by the standard. Most usage should be changed to the new int_fastN_t types in <stdint.h>; usage dealing with serialization should use the exact-width intN_t types. Well, actually these <stdint.h> types should be used to typedef new application-specific types.

This example really ought to be:

typedef int_fast32_t linecount_appt;
linecount_appt n;

This expresses a design assumption that linecount fits in 32 bits, and also makes it easy to fix the code if the design requirements change.

Ben Voigt
  • What if there are >10M lines of code to look at? Is it feasible to look at every single short, int and long? Asked a different way: for 64-bit portability, is there always a better alternative to using short, int or long? – Henry Merriam Jun 13 '11 at 21:26
  • The new types in `<stdint.h>` are not currently in standard C++, and may well not be available on the implementation in use. Moreover, I have philosophical objections to having to specify size in bits everywhere. – David Thornley Jun 13 '11 at 21:48
  • @David: That's why I suggest application-specific typedefs, so you aren't specifying size in bits everywhere, but just one typedef which can be changed at need. – Ben Voigt Jun 14 '11 at 00:48
  • @David Thornley, the `<stdint.h>` types are indeed not available in Visual Studio. Can you recommend (possibly as another answer) some way to handle problematic shorts, ints and longs? – Henry Merriam Jun 14 '11 at 14:37
  • @Henry Merriam: Visual Studio doesn't implement C99, which can be annoying, but `<stdint.h>` is basically a list of typedefs and macros. `int32_t` is a 32-bit integer type (if available), `int_least32_t` is the smallest integer type that's at least 32 bits, `int_fast32_t` is the fastest integer type that's at least 32 bits, and other numbers (8, 16, 32, and 64 are required for fast and least) may be substituted for the 32. (You can probably figure out what happens if you prepend a `u` to any of those types.) – David Thornley Jun 14 '11 at 14:55
  • For reference: http://stackoverflow.com/questions/126279/c99-stdint-h-header-and-ms-visual-studio – Ben Voigt Jun 14 '11 at 15:14
  • Types like `uint32_t` may seem like they should help, but given C's horrible integer promotion rules they can pose traps for the unwary. For example, `uint32_t x=3037000500; uint32_t y=x*x;` would have defined behavior on systems where `int` is 16 bits or 32 bits, but Undefined Behavior on systems where `int` is 64 bits. – supercat Mar 14 '14 at 03:35

It's clear what you need is a smart "range" analyzer tool to determine the range of values computed versus the type in which those values are stored. (Your fundamental objection is to that smart range analyzer being a person.) You might need some additional code annotations (manually well-placed typedefs or assertions that provide explicit range constraints) to enable a good analysis, and to handle otherwise apparently arbitrarily large user input.

You'd need special checks to handle the places where C/C++ says the arithmetic is legal but dumb (e.g., the assumption that you don't want [two's complement] overflows). For your n++ example (equivalent to n_after = n_before + 1), n_before can be 2^31-1 (because of your observations about strings), so n_before + 1 can be 2^31, which overflows a 32-bit int. (Strictly, signed overflow is undefined behavior in standard C/C++, though many implementations silently wrap.)

Our DMS Software Reengineering Toolkit in fact has range analysis machinery built in... but it is not presently connected to DMS's C++ front end; we can only pedal so fast :-{ [We have used it on COBOL programs for different problems involving ranges.]

In the absence of such range analysis, you could probably detect the existence of loops with such dependent flows; the value of n clearly depends on the loop count. I suspect this would flag every loop in the program that has a side effect, which might not be much help.

Another poster suggests redeclaring all the int-like declarations using application-specific types (e.g., *linecount_appt*) and then typedef'ing those to values that work for your application. To do this, I'd think you'd have to classify each int-like declaration into categories (e.g., "these declarations are all *linecount_appt*"). Doing this by manual inspection for 10M SLOC seems pretty hard and very error-prone. Finding all declarations which receive (by assignment) values from the "same" value sources might be a way to get hints about where such application types belong. You'd want to be able to mechanically find such groups of declarations, and then have some tool automatically replace the actual declarations with a designated application type (e.g., *linecount_appt*). This is likely somewhat easier than doing precise range analysis.

Ira Baxter
  • I agree that static analysis could help find out what domain-specific types variables ought to have (basically a form of dimensional analysis). Actually, if strong typedefs get added to the language, the compiler will do this without need for a separate tool. Range analysis won't really work on this example, because the range is defined by user input. I really like the idea of static analysis; counting lines of user input just seems to be a case that needs to be handled by design and can't be analyzed without some manual annotation. – Ben Voigt Jun 23 '11 at 08:44
  • @Ben Voigt: Not sure what you mean by "strong typedefs". If this is a proposed addition to the C++ language, it is pretty unlikely to help the OP anytime soon. Regarding range analysis: yes, some values are defined only by what users type in; for such values, without any constraint, a range analysis will likely identify them as requiring infinite precision. That's curable with some manual well-placed typedefs to constrain those values, and running the range analysis again. But the main point here is that at the scale of the OP's task, static analysis of some kind is likely necessary. – Ira Baxter Jun 23 '11 at 08:50
  • @Ben Voigt: Thanks for the link. I assume these aren't in the forthcoming C++ standard? – Ira Baxter Jun 23 '11 at 08:59
  • Newer version here: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2006/n2141.html Not in C++0x, the working group categorized it as "Not ready for C++0x, but open to resubmit in future" – Ben Voigt Jun 23 '11 at 09:05

There are tools that help find such issues. I won't give any links here because the ones I know of are commercial but should be pretty easy to find.

Nemanja Trifunovic
  • I have been looking, and commercial is not a problem. Can you give some examples? – Henry Merriam Jun 15 '11 at 18:54
  • I have looked at this tool, and it cannot find this kind of error. It only detects things like inappropriate types as array indexes or bad type conversions. This is a more complex error because the variable at risk of overflow (`n` or `i` in the above examples, respectively) doesn't participate directly in any expression with other, incompatible types. – Henry Merriam Jun 15 '11 at 20:00