This is about string representation; I've explained this in other threads. JS has only one number format (IEEE 754 float64), but it *pretends* to distinguish integers from floats when converting to string, by notating floats with exact integer values differently (without a decimal point, unlike other languages).
4611686018427387904 (decimal) == 2^62 == 0x1.0000000000000p+62 (hex float) == [43 d0 00 00 00 00 00 00] (big-endian encoding of the float64 bits) are all representations of the same exact integer, with no loss or rounding. Yet JS's toString returns 4611686018427388000.
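For anyone who wants to check this themselves, a quick sketch to paste into a Node or browser console (output shown in the comments is what I'd expect from a current engine):

    const x = 2 ** 62;                 // exactly representable in float64: mantissa 1, exponent 62
    console.log(x === 4611686018427387904);            // true  -- the float holds the exact integer
    console.log(x.toString());                         // "4611686018427388000" -- shortest decimal that round-trips
    console.log(Number("4611686018427388000") === x);  // true  -- the string parses back to the same float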
-
Literally, the bytes inside a JS interpreter's RAM will contain [43 d0 00 00 00 00 00 00], which per the IEEE 754 spec *means 4611686018427387904*, and yet calling toString on that value returns something else. I don't know how to make this any clearer.
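You can inspect those bytes yourself with a DataView; a small sketch (big-endian is the DataView default):

    const buf = new ArrayBuffer(8);
    new DataView(buf).setFloat64(0, 2 ** 62);   // write the float64 bits, big-endian
    const hex = [...new Uint8Array(buf)]
      .map(b => b.toString(16).padStart(2, '0'))
      .join(' ');
    console.log(hex);                           // "43 d0 00 00 00 00 00 00"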
-
Put another way: when given certain integer-valued floats and told to convert them to a string, JS *returns a string that contains a different integer*. Sure, that string rounds back to the same float when parsed, but why muck it up?
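If you actually need the exact digits of an integer-valued float, one workaround is going through BigInt, which is lossless for integral Numbers; roughly:

    const x = 2 ** 62;
    console.log(BigInt(x).toString());   // "4611686018427387904" -- the exact integer
    console.log(x.toString());           // "4611686018427388000" -- JS's shortest-round-trip rendering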