
JSON numbers are not JavaScript Numbers.

While the grammar is specified (that’s what JSON is, after all), the runtime representation is unspecified. A conformant JSON parser can parse “1” as 1.0. Parsed numbers can be backed by doubles, singles, or arbitrary-precision types.
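JavaScript itself illustrates this: JSON.parse backs every number with a double, so the distinction between “1” and “1.0” is gone the moment the text is parsed, while another conformant parser could just as legitimately hand back an integer or an arbitrary-precision value. A quick TypeScript check:

  // Both literals come back as the same double-precision Number.
  JSON.parse("1") === JSON.parse("1.0"); // true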



That's exactly my point. If one needs an integer or a float, the parser converts it. Under what use case is there ever any ambiguity?


> If one needs an integer or a float, the parser converts it.

Which parser? That’s the problem: if you’re using JSON as a data interchange format, you’ll need to carefully control both the serializers and the deserializers, and whatever libraries you use will need to (at least internally) hold onto the number in a lossless way. I am not aware of any libraries that do this; they all parse the number as an f64 before any deserializers run. If your input JSON contains a u128, you’ll lose precision when your type is deserialized.
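One way to hold onto the value losslessly, assuming you do control both ends, is to agree to carry wide integers as JSON strings; this is a sketch of that workaround (with a hypothetical id field), not something the libraries do for you. In TypeScript:

  // Sender: encode the 128-bit value as its decimal string form.
  const U128_MAX = 340282366920938463463374607431768211455n;
  const wire = JSON.stringify({ id: U128_MAX.toString() });
  // '{"id":"340282366920938463463374607431768211455"}'

  // Receiver: no parser ever sees a bare numeric token, so nothing is
  // forced through a double; convert back to BigInt after decoding.
  const received = JSON.parse(wire) as { id: string };
  console.log(BigInt(received.id) === U128_MAX); // true: exact round trip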

If you can set up (de)serialization to work the way you need it, then there’s no problem. But if you share your JSON-serialized data with other parties, then you or they may be in for a bit of a surprise.

You might find it a worthwhile exercise to try parsing JSON containing an arbitrary unsigned 128-bit integer in your language of choice.
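In TypeScript, for instance, the result of that exercise is an approximation rather than the integer that was actually in the document:

  // u128::MAX as a bare JSON number
  const raw = "340282366920938463463374607431768211455";

  const parsed = JSON.parse(raw);
  console.log(parsed);                       // 3.402823669209385e+38, an approximation
  console.log(Number.isSafeInteger(parsed)); // false: exact integer semantics are gone
  console.log(parsed === 2 ** 128);          // true: it even rounded up past u128::MAX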



