
I read the wiki page about heisenbugs, but I don't understand this example. Can anyone explain it in detail?

One common example of a heisenbug is a bug that appears when the program is compiled with an optimizing compiler, but not when the same program is compiled without optimization (as is often done for the purpose of examining it with a debugger). While debugging, values that an optimized program would normally keep in registers are often pushed to main memory. This may affect, for instance, the result of floating-point comparisons, since the value in memory may have smaller range and accuracy than the value in the register.
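For illustration, here is a minimal sketch (my own, not from the question or the wiki) of what that register-versus-memory effect can look like; whether it actually triggers depends on the compiler, optimization level, and FPU (x87 versus SSE):

#include <iostream>

int main(int argc, char**) {
    double d = argc;                  // runtime value, so nothing is folded at compile time
    volatile double stored = d / 3.0; // 'volatile' forces a real store: rounded to a 64-bit double
    // On builds that evaluate expressions in extended-precision registers
    // (e.g. 32-bit x87 with optimization), the left-hand side below can still
    // be an 80-bit value, so this may print "not equal". On SSE/x86-64 builds,
    // or with -ffloat-store, both sides are 64-bit and it prints "equal".
    if (d / 3.0 == stored)
        std::cout << "equal" << std::endl;
    else
        std::cout << "not equal" << std::endl;
    return 0;
}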

del bao

3 Answers


Here's a concrete example recently posted:

Infinite loop heisenbug: it exits if I add a printout

It's a really nice specimen because we can all reproduce it: http://ideone.com/rjY5kQ

These bugs depend on such precise features of the platform that people also find them very difficult to reproduce.


In this case, when the print-out is omitted, the program performs the comparison at the higher precision of the CPU's floating-point registers (higher than a double stored in memory). But in order to print the value, the compiler has to move the result out to main memory, which implicitly truncates it to double precision. When that truncated value is used for the comparison, it succeeds and the loop terminates.

#include <iostream>
#include <cmath>
  
double up = 19.0 + (61.0/125.0);
double down = -32.0 - (2.0/3.0);
double rectangle = (up - down) * 8.0;
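// 'up' and 'down' bound a rectangle of width 8 (x from 2 to 10); main() subtracts
// trapezoid-rule strips from it to estimate the area between the curves f() and g().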
 
double f(double x) {
    return (pow(x, 4.0)/500.0) - (pow(x, 2.0)/200.0) - 0.012;
}
 
double g(double x) {
    return -(pow(x, 3.0)/30.0) + (x/20.0) + (1.0/6.0);
}
 
double area_upper(double x, double step) {
    return (((up - f(x)) + (up - f(x + step))) * step) / 2.0;
}
 
double area_lower(double x, double step) {
    return (((g(x) - down) + (g(x + step) - down)) * step) / 2.0;
}
 
double area(double x, double step) {
    return area_upper(x, step) + area_lower(x, step);
}
 
int main() {
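    // Refine the step (divided by 10 each pass) until two consecutive rounded
    // estimates agree; without the print-out below, 'current' may stay in an
    // extended-precision register and never compare equal to 'last'.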
    double current = 0, last = 0, step = 1.0;
 
    do {
        last = current;
        step /= 10.0;
        current = 0;
 
        for(double x = 2.0; x < 10.0; x += step) current += area(x, step);
 
        current = rectangle - current;
        current = round(current * 1000.0) / 1000.0;
        //std::cout << current << std::endl; //<-- COMMENT BACK IN TO "FIX" BUG
    } while(current != last);
 
    std::cout << current << std::endl;
    return 0;
}

Edit: Verified that the bug and the fix still reproduce: 03-Feb-22, 20-Feb-17

Persixty

The name comes from the Heisenberg Uncertainty Principle, which states that there is a fundamental limit to the precision with which certain pairs of physical properties of a particle can be known simultaneously. If you observe a particle too closely (i.e., you know its position precisely), then you can't measure its momentum precisely (and if you know its speed precisely, you can't tell its exact position).

Following this, a heisenbug is a bug which disappears when you watch it closely.

In your example, if you need the program to perform well, you compile it with optimization, and there is a bug. But as soon as you switch to debugging, you compile it without optimization, which removes the bug.

So if you start observing the bug too closely, you become unable to determine its properties (or to find it at all), which resembles Heisenberg's Uncertainty Principle, hence the name heisenbug.

saurabh

The idea is that the code is compiled in two different ways: one is normal or debug mode, and the other is optimised or production mode.

Just as it is important to know what happens to matter at the quantum level, we should also know what happens to our code at the compiler level!

Dragonborn