
0.4 schemas: add testing infrastructure for strict and base validation #92

Merged · 10 commits · Feb 7, 2022

Conversation

@sbesson (Member) commented Feb 2, 2022

Follow-up of #87

This makes a minimal set of modifications that allows testing the JSON samples against both the base image schema and the strict image schema.

  • fixes the $id and the $ref to use 0.4 and defines an $id for the strict schema
  • adds the test_validation logic to load both schemas and use RefResolver to resolve the base schema
  • updates the test_validation tests to validate valid examples against both the base and strict schemas
  • renames the folders to valid and valid_strict for clarity (better suggestions welcome)
  • moves examples that are valid but not strict, e.g. missing_name.json, under the corresponding folder
  • removes the duplicate_axes_name.json sample; validating these types of constraints might require a more advanced framework
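The RefResolver wiring described above can be sketched roughly as follows. This is a minimal stand-in: the two schemas here are hypothetical, not the actual NGFF image schemas, and the real test code loads them from files under 0.4/schemas/.

```python
import jsonschema

# Hypothetical base schema with its own $id (stand-in for image.schema).
base_schema = {
    "$id": "https://example.com/image.schema",
    "type": "object",
    "properties": {"name": {"type": "string"}},
}

# The strict schema reuses the base via $ref and adds a required field
# (stand-in for strict_image.schema).
strict_schema = {
    "$id": "https://example.com/strict_image.schema",
    "allOf": [{"$ref": "https://example.com/image.schema"}],
    "required": ["name"],
}

# Pre-registering the base schema in the resolver's store lets the $ref
# resolve locally, without fetching the $id URL over the network.
resolver = jsonschema.RefResolver(
    base_uri=strict_schema["$id"],
    referrer=strict_schema,
    store={base_schema["$id"]: base_schema},
)

# A valid sample passes both the base and the strict schema.
sample = {"name": "example"}
jsonschema.validate(sample, base_schema)
jsonschema.validate(sample, strict_schema, resolver=resolver)

# A sample missing "name" passes the base schema but fails strict validation.
jsonschema.validate({}, base_schema)
try:
    jsonschema.validate({}, strict_schema, resolver=resolver)
    strict_ok = True
except jsonschema.ValidationError:
    strict_ok = False
print(strict_ok)  # False
```

This is the general pattern for validating the same sample against a layered pair of schemas with shared $ref targets.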

@constantinpape (Contributor) left a comment:

LGTM

@constantinpape constantinpape mentioned this pull request Feb 2, 2022
@will-moore (Member) left a comment:

I wonder if there is a place to put "Invalid files which we don't yet know how to test via the schema" such as the duplicate axes names example?
These could be useful (alongside the existing examples) to illustrate the spec rules, and also to highlight what isn't caught by the schema (and might need to be validated in other ways)?
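As an aside, a constraint like unique axis names is straightforward to check in plain Python even though JSON schema cannot express it. A minimal sketch (the helper below is hypothetical, not part of this PR):

```python
# Hypothetical check for a constraint that plain JSON schema cannot express:
# axis names within a multiscales entry must be unique. JSON schema's
# uniqueItems compares whole array items, so two axes with the same name but
# different other fields would not be caught.
def axes_names_unique(sample):
    for multiscale in sample.get("multiscales", []):
        names = [axis["name"] for axis in multiscale.get("axes", [])]
        if len(names) != len(set(names)):
            return False
    return True


# Mirrors the removed duplicate_axes_name.json sample: two axes named "y".
sample = {"multiscales": [{"axes": [{"name": "y"}, {"name": "y"}]}]}
print(axes_names_unique(sample))  # False
```

A more advanced validation framework could run checks like this alongside the schema-based ones.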

@sbesson (Member, Author) commented Feb 3, 2022

I wonder if there is a place to put "Invalid files which we don't yet know how to test via the schema" such as the duplicate axes names example?

Yes, on second thought removing this file loses some information, as these are legitimately invalid sample files. When we decide to re-explore alternative validation frameworks, it will be useful to keep them around to prove we can handle more advanced constraints.

I can see two ways to handle that:

  • either create a new invalid_* folder as you suggest to separate these samples from the other invalid samples, which will fail the tests. My primary issue is that it strongly couples the examples/ directory layout to the JSON schema features
  • move the example under the invalid folder and add some logic to annotate these samples
    • by changing the file collection to use a more fine-tuned glob, possibly creating a hard-coded list of files
    • by adding some internal field that would indicate whether the sample file should xfail the validation test
    • any other ideas?

@constantinpape (Contributor) commented:

I can see two ways to handle that:

  • either create a new `invalid_*` folder as you suggest to separate these samples from the other invalid samples, which will fail the tests. My primary issue is that it strongly couples the `examples/` directory layout to the JSON schema features
  • move the example under the `invalid` folder and add some logic to annotate these samples
    • by changing the file collection to use a more fine-tuned `glob`, possibly creating a hard-coded list of files
    • by adding some internal field that would indicate whether the sample file should `xfail` the validation test
    • any other ideas?

I'd go with the simplest solution and just create a separate folder for these cases.

@sbesson (Member, Author) commented Feb 4, 2022

@constantinpape @will-moore the last set of commits now keeps the invalid examples which are not flagged by the JSON schema under a separate folder, with associated xfail tests. The invalid examples are also tested against both the base and strict validators. Anything else worth adding here, or should we merge this?
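The folder-plus-xfail collection could look roughly like this. The folder names below are illustrative, not necessarily the exact ones used in the PR:

```python
import glob

import pytest


# Sketch of collecting invalid examples from two hypothetical folders:
# examples/invalid/ holds samples the JSON schema does flag, while
# examples/invalid_should_fail/ holds samples whose constraints the schema
# cannot express, so they are kept and marked as expected failures instead
# of being deleted.
def collect_invalid_examples():
    params = [
        pytest.param(path)
        for path in sorted(glob.glob("examples/invalid/*.json"))
    ]
    params += [
        pytest.param(
            path,
            marks=pytest.mark.xfail(
                reason="constraint not expressible in JSON schema"
            ),
        )
        for path in sorted(glob.glob("examples/invalid_should_fail/*.json"))
    ]
    return params
```

A test parametrized with `collect_invalid_examples()` would assert that validation raises for each file; the xfail-marked samples are then allowed to pass validation without failing the suite.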

@will-moore (Member) commented:
I think it's good to merge. Are there any/many actual schema changes to update the strict schema for v0.4? I assume they'll come in a follow-up PR?

@sbesson (Member, Author) commented Feb 4, 2022

I don't have anything pending in terms of schema additions. I would certainly want to handle those as dedicated follow-up PRs.

Should we create a reference issue to capture these tasks? A list of things that come to mind:

  • units, which were removed from the base schema in 95c20ed and should probably be reintroduced into the strict schema
  • schemas for plate and well specification
  • possibly extracting the omero specification as a sub-schema like axes

@sbesson sbesson merged commit 12acbb6 into ome:main Feb 7, 2022
@sbesson sbesson deleted the 0.4_schemas_strict_validation branch February 7, 2022 11:29