
I tried this code with these values:

float timeStamp;
a = 1338526801;
b = 113678;

timeStamp = a + (b / 1000000);

Then I changed b to 113680 and calculated timeStamp again:

timeStamp = a + (b / 1000000);

The timeStamp should change because b has changed, but when I print it with Console.WriteLine(), the value doesn't change. I think this has to do with the precision of the C# numeric types, but I don't know how to resolve it.

  • No, you didn't try that code because that doesn't even compile. Please post the actual code you have. Your question/issue is also pretty unclear, please be more clear about what you're trying to do, what you expected to happen, and what actually happened. – tnw Jan 23 '14 at 21:47
  • So use a `double`, which is more precise. – Harrison Jan 23 '14 at 21:49
  • Use `DateTime.Seconds` or convert a `DateTime` to Ticks. floats/doubles are inherently inaccurate. Wikipedia surely has a good explanation of the IEEE floating point standard if you want to know why. – evanmcdonnal Jan 23 '14 at 21:55
  • It seems like you're trying to parse `Unix Time`. Here's how you do it: http://stackoverflow.com/a/20796273/885318 – i3arnon Jan 23 '14 at 22:01
  • No, I don't want to convert time; you can treat ts_sec and ts_usec as ordinary variables. – marziye esmslampanah Jan 23 '14 at 22:09

1 Answer


You should take a look at the Floating-Point Types Table (C# Reference), which gives the following info:

> Type       Approximate range      Precision 
> float      ±1.5e−45 to ±3.4e38    7 digits
> double     ±5.0e−324 to ±1.7e308  15-16 digits

Your combination of whole seconds plus microseconds/1000000 needs about 16 significant digits, so it fits better in a double.

A float, which holds only about 7 significant digits, gets you accuracy to 338526800.000000 and no more:

float f = 338526801 + 113678f / 1000000;
System.Diagnostics.Debug.Print(f.ToString("F6")); // results in 338526800.000000

A double, however, gives you 15-16 significant digits and can actually store the value to your precision:

double d = 338526801d + 113678d / 1000000;
System.Diagnostics.Debug.Print(d.ToString("F6")); // results in 338526801.113678
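Applied to the values from the question, here is a minimal sketch (the variable names a and b come from the question; the 1000000.0 literal is my addition to force floating-point division, since b / 1000000 with integer operands would always be 0):

long a = 1338526801;    // ts_sec from the question
long b = 113678;        // ts_usec from the question

double timeStamp = a + b / 1000000.0;           // 1000000.0 forces double division
Console.WriteLine(timeStamp.ToString("F6"));    // ~1338526801.113678

b = 113680;
timeStamp = a + b / 1000000.0;
Console.WriteLine(timeStamp.ToString("F6"));    // ~1338526801.113680; the change in b is now visible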

You could also look at TimeSpan and DateTime, which give you accuracy in 100-nanosecond units (ticks). Since there are 10 ticks in a microsecond (µs), the same value as a TimeSpan would be:

TimeSpan time = new TimeSpan(3385268011136780);

One of the comments suggested you might be trying to convert Unix time. If so, you can add the TimeSpan to a DateTime representing 1/1/1970 (the Unix epoch).
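A minimal sketch of that conversion, assuming the question's values are Unix seconds and microseconds (the variable names and the epoch constant here are illustrative, not from the original answer):

long seconds = 1338526801;      // ts_sec
long microseconds = 113678;     // ts_usec

DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
// 1 tick = 100 ns, so each whole second contributes 10,000,000 ticks and each microsecond 10 ticks
DateTime utc = epoch + new TimeSpan(seconds * 10000000L + microseconds * 10L);

Console.WriteLine(utc.ToString("o"));   // 2012-06-01T05:00:01.1136780Z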

Harrison