I know that, as much as we want to believe computers are unerring, transistors are not perfect, and at the transistor level 1 + 1 will not always return 2.
I also know that, to protect us from such errors, most modern computers employ redundancy along with error-detection and error-correction mechanisms.
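To make the detection idea concrete, here is a toy sketch of my own (not how real hardware implements it) of single-bit error detection using one parity bit; actual ECC DRAM uses SECDED codes that can also correct the flipped bit:

#include <bitset>
#include <cstdint>
#include <iostream>

// Toy even-parity check over a 32-bit word. Storing one extra bit
// lets us detect any single-bit flip, though not correct it.
bool parityBit(uint32_t word)
{
    return std::bitset<32>(word).count() % 2 != 0;
}

int main()
{
    uint32_t word = 2;                    // the value we "store"
    bool storedParity = parityBit(word);  // parity recorded at store time

    word ^= (1u << 7);                    // simulate a cosmic-ray bit flip

    if (parityBit(word) != storedParity)
        std::cout << "single-bit error detected\n";
    return 0;
}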
That being said, what are the chances of the following C++ program printing the wrong result, without warning? Is there even a chance?
#include <iostream>
using namespace std;

int main()
{
    int a = 1, b = 1;
    int sum = a + b;
    cout << "Sum = " << sum;
    return 0;
}
Let's assume we are running it on an average $1000 x64 laptop, as of 2020.
This question has a broader scope: we run billions of calculations per second, so I want to know, on a theoretical level, how much can go wrong in a complex program.
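For a rough sense of scale, here is a back-of-envelope calculation; the soft-error rate below (in FIT, failures per 10^9 device-hours, per Mbit of DRAM) is purely an assumed, illustrative number, since published figures vary by orders of magnitude:

#include <iostream>

int main()
{
    // All numbers here are assumptions for illustration only.
    const double fitPerMbit = 1000.0;       // assumed soft-error FIT per Mbit of DRAM
    const double dramGigabytes = 8.0;       // typical 2020 laptop
    const double totalMbit = dramGigabytes * 1024.0 * 8.0;

    const double flipsPerHour = totalMbit * fitPerMbit / 1e9;
    std::cout << "expected bit flips per hour: " << flipsPerHour << '\n';
    std::cout << "mean hours between flips:    " << 1.0 / flipsPerHour << '\n';
    return 0;
}

Under these assumed figures the machine would see a flipped DRAM bit roughly every 15 hours; whether any given flip lands in the few bytes this program actually touches, during the microseconds it runs, is a separate question.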