
Detect duplicate fields in the parser #86

Merged (1 commit, Aug 24, 2020)

Conversation

@szeiger (Collaborator) commented Aug 24, 2020

  • The error message has a rather unhelpful position, but this is consistent with how other static errors are currently handled by Sjsonnet. The original Jsonnet parser does this better.

  • There are two different places in the Jsonnet codebase where duplicate fields are detected: statically in the parser (sketched below) and dynamically in the VM. Without digging much deeper into their C++ implementation, it's not clear to me when the second one is supposed to kick in. There is no specification or test case for the dynamic detection.

Fixes #69
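
The static check itself is simple once the parser has collected an object's field names. The following is a minimal, self-contained sketch of the idea; the Field and StaticError types and the exact error message are hypothetical and only illustrate a duplicate check applied at parse time, not sjsonnet's actual implementation.

// Hypothetical sketch: reject an object whose field names repeat.
final case class Field(name: String, value: String)
final case class StaticError(msg: String) extends Exception(msg)

object DuplicateFieldCheck {
  // Throws a StaticError naming the first duplicated field, if any.
  def check(fields: Seq[Field]): Unit = {
    val seen = scala.collection.mutable.HashSet.empty[String]
    for (f <- fields) {
      if (!seen.add(f.name))                          // add returns false if the name was already seen
        throw StaticError(s"Duplicate field: ${f.name}")
    }
  }

  def main(args: Array[String]): Unit = {
    check(Seq(Field("a", "1"), Field("b", "2")))      // { a: 1, b: 2 } is accepted
    try check(Seq(Field("a", "1"), Field("a", "2")))  // { a: 1, a: 2 } is rejected
    catch { case StaticError(msg) => println(msg) }   // prints: Duplicate field: a
  }
}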

@@ -303,5 +303,8 @@ object ErrorTests extends TestSuite{
// | at .(sjsonnet/test/resources/test_suite/error.invariant.simple3.jsonnet:18:10)
// |""".stripMargin
// )
test("duplicate_fields") - check(
Contributor:

Could we put this in EvaluatorTests instead? Currently all the test cases in test_suite/ that are used in FileTests.scala and ErrorTests.scala are vendored directly from the google/jsonnet implementation, so it's best to keep our own custom additions separate, so that next time we update the vendored code we don't accidentally lose them.

Collaborator Author:

EvaluatorTests seems like a strange place for a parser test.

I also assumed that the test suite should be the original Jsonnet one, but then I saw that it has had a few changes applied over time and no longer matches the current Jsonnet test suite.

Contributor:

How about ParserTests? There isn't much in there now, but if this is a parser test then it seems like it belongs there.

@lihaoyi-databricks (Contributor)

One small comment, but feel free to merge once resolved

@szeiger (Collaborator Author) commented Aug 24, 2020

Test moved to ParserTests.
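
For context, a uTest-style test along these lines might look like the sketch below; the parseObject helper and its error message are hypothetical stand-ins, not sjsonnet's actual parser entry point or the check helper used in its test suites.

import utest._

// Hypothetical sketch of a ParserTests-style case for duplicate fields.
object DuplicateFieldParserTests extends TestSuite {
  // Stand-in for the real parser: pulls "name:" keys out of a tiny object
  // literal and fails on the first duplicated name.
  def parseObject(src: String): Either[String, Seq[String]] = {
    val names = "([A-Za-z_][A-Za-z0-9_]*)\\s*:".r
      .findAllMatchIn(src).map(_.group(1)).toSeq
    names.diff(names.distinct).headOption match {
      case Some(dup) => Left(s"Duplicate field: $dup")
      case None      => Right(names)
    }
  }

  val tests = Tests {
    test("duplicate_fields") {
      assert(parseObject("{ a: 1, a: 2 }") == Left("Duplicate field: a"))
      assert(parseObject("{ a: 1, b: 2 }") == Right(Seq("a", "b")))
    }
  }
}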

szeiger merged commit 0eaea66 into databricks:master on Aug 24, 2020
Successfully merging this pull request may close these issues:

Sjsonnet does not error when duplicate fields are present in an object