This page says the following:
> ClojureScript currently only supports integer and floating point literals that map to JavaScript primitives
Can someone tell me what that actually means for integers? Are they 64-bit (probably not, since that would take two JS numbers)? Or 32-bit? Or 53-bit (that being the maximum number of integer bits a double can represent exactly, AFAIK, see here)?
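To make the question concrete, this is the sort of check I have in mind (just a sketch; `max-safe` is my own name, and 9007199254740991 is 2^53 - 1, i.e. JavaScript's `Number.MAX_SAFE_INTEGER`):

```clojure
;; 2^53 - 1: the largest integer a JS double represents exactly
(def max-safe 9007199254740991)

;; ClojureScript: both sides round to the same double (2^53), so this is true
;; Clojure (JVM): longs are exact 64-bit integers, so this is false
(= (inc max-safe) (+ max-safe 2))
```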
[EDIT] The reason I want to know this is that I want to write a "simulation" in a "cross-platform" language, such that the simulation gives the same results on the client (Browser (JS), Android, Web Start, ...) and on the server (JVM). Floating-point numbers are known to cause "de-synchronisation" in simulations, because different hardware can give different results for the same calculation with the same input. Therefore I want to use "integers" only, but if the size of integers differs between Clojure and ClojureScript, I'm still going to get "de-synchronisation" eventually (for example, when hitting integer overflow, which random number generators depend on, and which simulations use heavily).
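Here is a sketch of the kind of desync I'm afraid of, using a textbook 32-bit LCG step (the constants 1664525 and 1013904223 are the usual Numerical Recipes ones; the function names are my own):

```clojure
;; One step of a linear congruential generator that assumes
;; integer overflow wraps around
(defn lcg-step [seed]
  (+ (* seed 1664525) 1013904223))

;; Iterate the generator n times from a starting seed
(defn run [n seed]
  (if (zero? n)
    seed
    (recur (dec n) (lcg-step seed))))
```

If I understand the platforms correctly, `(run 3 1)` already overflows a 64-bit long, so on the JVM it throws an `ArithmeticException` (or wraps at 64 bits under `*unchecked-math*`), while in ClojureScript the same call silently returns an imprecise double. That would be several different behaviours for identical source code, which is exactly what I need to avoid.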