Linter fails on `exclusiveMaximum` value with a large integer #235
Hey @dlo, that's an interesting one! The ECMA-404 standard for JSON defines numbers as having arbitrary precision, i.e. according to it, you could have a 512-bit integer there! Meanwhile, the IETF RFC 8259 version of it clarifies that anything beyond 64-bit precision is not guaranteed to be interoperable:
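> This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754 binary64 (double precision) numbers is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision.

(RFC 8259, §6)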
We maintain our own JSON parser (here: https://github.com/sourcemeta/core/tree/main/src/core/json) and we do constrain integers to signed 64-bit. The problematic integer you have there, 9223372036854776000, is slightly over that limit (the maximum signed 64-bit integer is 9223372036854775807). I'm not sure what language you are using in general, but what makes this whole thing more confusing is that languages like JavaScript by default represent integers as IEEE 754 floating point numbers (https://stackoverflow.com/questions/9643626/does-javascript-support-64-bit-integers). That means that Node.js will "take" numbers like 9223372036854776000, but with loss of precision and very strange behaviour. For example, a Node.js REPL session goes roughly like this:
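```js
> Number.MAX_SAFE_INTEGER // the largest integer a float64 can represent safely
9007199254740991
> 9223372036854776000 + 1 // adding 1 does nothing
9223372036854776000
> 9223372036854776000 === 9223372036854776000 + 1000 // equality breaks too
true
```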
As you can see, basic arithmetic is completely broken. In general, do you have a strong reason to use integers beyond the 64-bit range? If possible, I would recommend avoiding it, as most JSON parsers and JSON Schema validators out there will exhibit weird behaviour and poor interoperability, which is why the RFC constraints are in there! That said:
Let me get into the error message one, hopefully today.
Though actually... maybe it is not that hard to do 128-bit integers (given …)
See v6.0.2 (https://github.com/sourcemeta/jsonschema/releases/tag/v6.0.2). That version at least improves the error message. I'll keep this issue open as I might play around with 128-bit integer support.
Hi @jviotti—thanks so much for the thoughtful follow-up! And the improved error message helps a ton.
Quite the small delta there! So, for context, here's some background on why I'm using the library and where this value is coming from. I'm writing some basic protobuf schemas for use with ConnectRPC, and then generating JSON Schemas for use with OpenAI's structured outputs. The protobuf definition with the value comes straight from Google's money.proto definition, which is then converted using Buf's proto-to-JSON-Schema generator. I've isolated the issue to how Go marshals the float64 cast of the largest 64-bit integer value. Check out the code here. Basically:

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	a := uint64(1) << (64 - 1) // 9223372036854775808, i.e. 2^63
	fmt.Printf("%d\n", a)
	b := float64(a) // 2^63 happens to be exactly representable as a float64
	fmt.Printf("%f\n", b)
	c, _ := json.Marshal(b) // encoding/json emits the shortest decimal that round-trips
	fmt.Printf("%s\n", c)
}
```

Outputs:
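```
9223372036854775808
9223372036854775808.000000
9223372036854776000
```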
And since there is no float type in JSON, it's just interpreted as a number and shows up as 9223372036854776000. To work around this for now, I might just search and replace the "bad" value with the good one.
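Something along these lines, assuming the generated schema lives in money.schema.json (a placeholder name) and using the actual signed 64-bit maximum as the replacement:

```sh
# GNU sed; on macOS use: sed -i '' 's/.../.../g' money.schema.json
sed -i 's/9223372036854776000/9223372036854775807/g' money.schema.json
```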
Hi @dlo,
Sounds super cool. I would love to learn more about it at some point, mainly as I'm deep in the binary serialisation space (I've published papers touching on Protocol Buffers) and work a lot with JSON Schema ontologies, also touching on AI a bit. Would you be open to a virtual coffee chat to say hi sometime next week?
Sounds like you could make a contribution to the converter, or at least file an issue?
@dlo Total aside, but seeing your …
Hi all! Thanks so much for open sourcing this tool.
I noticed that the linter fails when checking a file with a large integer (9223372036854776000), even though it's valid JSON. I ran into this when trying to bundle several files, one of which included the one that failed the linter. Here's the file in its entirety (you'll want to remove the // FAILS HERE text, obviously):
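(Representative sketch: the `units` property and draft dialect here are illustrative; the `exclusiveMaximum` bound is the failing value.)

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "units": {
      "type": "integer",
      "exclusiveMaximum": 9223372036854776000 // FAILS HERE
    }
  }
}
```

Invocation (filename is a placeholder):

```sh
jsonschema lint money.schema.json
```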