Is your feature request related to a problem or challenge? Please describe what you are trying to do.
The arbitrary_precision feature flag added in #779 alters how serde_json decodes numeric types: instead of decoding them into their native types, it decodes them as maps with special keys.
Unfortunately this tends to break downstream code in strange ways - see here. Because Cargo feature flags are additive and apply crate-wide, this behaviour is unexpected, hard to diagnose, and impossible to opt out of.
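For context, here is a minimal, hypothetical sketch of the kind of breakage this can cause (it assumes serde_json has been built with arbitrary_precision enabled somewhere in the dependency graph; the enum and values are purely illustrative):

```rust
use serde::Deserialize;

// An untagged enum is a common downstream pattern that relies on numbers
// being presented to serde as plain integers.
#[derive(Deserialize, Debug)]
#[serde(untagged)]
enum Scalar {
    Int(i64),
    Text(String),
}

fn main() {
    // Without arbitrary_precision this parses as Scalar::Int(1). With the
    // feature enabled, serde_json buffers the number through a map with an
    // internal key, so untagged (and similarly flattened) deserialization
    // can start to fail even though the JSON input is unchanged.
    let parsed: Result<Scalar, _> = serde_json::from_str("1");
    println!("{parsed:?}");
}
```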
Describe the solution you'd like
I have filed a ticket for native 128-bit support in serde_json here which I think is the ideal solution, but until then I would like to propose we encode 128-bit numbers as strings.
FWIW I'm not sure this will necessarily be any more or less broken than the status quo: whilst the JSON grammar itself permits arbitrary-precision numbers, many implementations assume values fit into 64-bit doubles (i.e. 53 bits of integer precision). It is for this reason that protobuf's JSON format encodes even 64-bit integers as strings, let alone 128-bit ones - see here.
Additionally the arrow docs seem to suggest that integer data in JSON is interpreted as i64 - see here. It is unclear how i128 is expected to be encoded/decoded.
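To illustrate the proposed encoding, here is a minimal sketch using a serde `with` helper; the `i128_as_string` module and `Row` struct are hypothetical names for illustration, not part of any existing crate:

```rust
use serde::{Deserialize, Deserializer, Serialize, Serializer};

// Hypothetical helper: serialize an i128 as a decimal string and parse it
// back on deserialization, mirroring how protobuf's JSON mapping handles
// 64-bit integers.
mod i128_as_string {
    use super::*;

    pub fn serialize<S: Serializer>(v: &i128, s: S) -> Result<S::Ok, S::Error> {
        s.serialize_str(&v.to_string())
    }

    pub fn deserialize<'de, D: Deserializer<'de>>(d: D) -> Result<i128, D::Error> {
        let s = String::deserialize(d)?;
        s.parse().map_err(serde::de::Error::custom)
    }
}

#[derive(Serialize, Deserialize)]
struct Row {
    #[serde(with = "i128_as_string")]
    value: i128,
}

fn main() -> Result<(), serde_json::Error> {
    let json = serde_json::to_string(&Row { value: i128::MAX })?;
    // Prints: {"value":"170141183460469231731687303715884105727"}
    println!("{json}");
    let row: Row = serde_json::from_str(&json)?;
    assert_eq!(row.value, i128::MAX);
    Ok(())
}
```

This keeps the JSON representation within the numeric range that common parsers handle safely, at the cost of requiring consumers to parse the string back into an integer.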
Describe alternatives you've considered
Add a feature flag that allows users to opt in to using arbitrary_precision.