Say I have a couple of values with uncertainty estimates:
double x, sigma_x;
e.g.
45.34302958634 ± 4.25976343
3.52986798343 ± 0.2363467
3.3734874533e+12 ± 6.34659e+6
Clearly, most of those decimals aren't significant. How do I choose the correct number of "significant digits" (and what does that even mean here?) so that printf prints as many decimals as needed, but no more?
That is, I want some way to build a char* fmtString, dependent on sigma_x, such that
printf(fmtString, x)
yields
45
3.5
3.373487e+12
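
For illustration, here is a minimal sketch of one possible approach, under the assumption that "significant" means keeping digits of x only down to the decade of sigma_x's leading digit (which is what reproduces the expected output above; some people would keep one or two guard digits more). The helper name fmt_for_sigma is made up for this sketch; the idea is just floor(log10(|x|)) - floor(log10(sigma)) + 1 significant digits fed into a "%.*g"-style format.

#include <math.h>
#include <stdio.h>

/* Hypothetical helper: build a printf format that keeps digits of x only
 * down to the decade of sigma's leading digit (assumption stated above). */
static void fmt_for_sigma(char *fmt, size_t n, double x, double sigma)
{
    int ex  = (int)floor(log10(fabs(x)));   /* decade of x's leading digit     */
    int es  = (int)floor(log10(sigma));     /* decade of sigma's leading digit */
    int sig = ex - es + 1;                  /* significant digits to keep      */
    if (sig < 1)
        sig = 1;                            /* sigma >= x: keep a single digit */
    snprintf(fmt, n, "%%.%dg", sig);        /* e.g. "%.2g", "%.7g"             */
}

int main(void)
{
    const double xs[]     = { 45.34302958634, 3.52986798343, 3.3734874533e+12 };
    const double sigmas[] = { 4.25976343,     0.2363467,     6.34659e+6 };

    for (int i = 0; i < 3; i++) {
        char fmt[16];
        fmt_for_sigma(fmt, sizeof fmt, xs[i], sigmas[i]);
        printf(fmt, xs[i]);                 /* prints 45, 3.5, 3.373487e+12 */
        putchar('\n');
    }
    return 0;
}

Note that %g drops trailing zeros and switches to exponential notation on its own rules, which happens to match the examples here but may not be what you want in every case.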