
Skip tests when missing language interpreters, rather than failing them. #466

Closed
rfk opened this issue May 26, 2021 · 5 comments
Labels: get involved (Good opportunity for deeper-dive onboarding)

rfk (Collaborator) commented May 26, 2021

As suggested in #464 (review), it's a little hostile to expect that everyone working on UniFFI will have all of the supporting software for all supported languages installed on their development machine in order to run cargo test. That's what we have CI for!

Instead of throwing an error when e.g. the Ruby interpreter or the Swift compiler is not available, can we do something similar to "skip test" so that it doesn't fail local builds? (And then in CI, require that no tests be skipped?).

(This issue is synchronized with Jira task UNIFFI-61.)

rfk (Collaborator, Author) commented Jun 3, 2021

It doesn't seem to be possible to dynamically decide to skip a test in Rust (e.g. there's no equivalent of Python's raise SkipTest), but Rust does have support for marking tests as ignored:

#[test]
#[ignore]
fn this_test_wont_run_by_default() {
    assert!(false, "should not be run by default");
}

Luckily for us, all the foreign-language tests are auto-generated via the build_foreign_language_testcases macro. What we could do is have this macro inspect the system to see whether it's possible to run each of the target tests, and include the #[ignore] annotation automatically if it is not. It's kind of poor form for macros to inspect the system like that, but I think it would be fine in this case.

One nice thing about this approach is that in CI, we could run the tests with cargo test -- --ignored to force it to run any ignored tests, helping us make sure that the tests actually work in CI even if they don't run on individual dev machines.

rfk (Collaborator, Author) commented Jun 11, 2021

What we could do is have this macro inspect the system to see whether it's possible to run each of the target tests, and include the #[ignore] annotation automatically if it is not.

Concretely, we'd have to expose a new function for each language backend like can_run_tests or similar, that inspects the system and returns a boolean indicating whether we can run the tests.
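
A minimal sketch of what such a probe might look like for the Ruby backend (the name can_run_tests comes from the paragraph above; the details are illustrative, not a real implementation):

use std::process::Command;

/// Illustrative probe for the Ruby backend: true only if a `ruby`
/// interpreter is on $PATH and can actually execute a trivial program.
fn can_run_tests() -> bool {
    Command::new("ruby")
        .args(["-e", "exit 0"])
        .output()
        .map(|output| output.status.success())
        .unwrap_or(false)
}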

Then, we'd have to move some of the logic for figuring out which language backend corresponds to which test file out of the run-time code in uniffi::testing and into the build-time code in the build_foreign_language_testcases macro. This logic would be expanded to call the appropriate can_run_tests function and emit or omit the #[ignore] attribute as appropriate. Luckily it is already a procedural macro rather than a macro_rules! macro, because I don't think you could do that sort of conditional logic from the latter.
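
For the macro side, the expansion could look roughly like this (a sketch built on the quote crate; generate_test and its arguments are hypothetical stand-ins for however the real macro threads this information through):

use proc_macro2::TokenStream;
use quote::{format_ident, quote};

// Sketch: emit one test function, attaching #[ignore] whenever the
// backend probe reported that the toolchain isn't usable locally.
fn generate_test(test_name: &str, can_run: bool) -> TokenStream {
    let ident = format_ident!("{}", test_name);
    let ignore = if can_run { quote! {} } else { quote! { #[ignore] } };
    quote! {
        #[test]
        #ignore
        fn #ident() {
            // run the foreign-language testcase here
        }
    }
}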

We'd also want a simple in-memory cache of the results of calling can_run_tests, similar to how uniffi::testing remembers which crates it has already compiled.
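
That cache could be as simple as a lazily-initialized map keyed by language name (a sketch assuming the once_cell crate; all names here are hypothetical):

use once_cell::sync::Lazy;
use std::collections::HashMap;
use std::sync::Mutex;

// Sketch: probe each language backend at most once per process.
static CAN_RUN: Lazy<Mutex<HashMap<&'static str, bool>>> =
    Lazy::new(|| Mutex::new(HashMap::new()));

fn can_run_cached(language: &'static str, probe: fn() -> bool) -> bool {
    *CAN_RUN.lock().unwrap().entry(language).or_insert_with(probe)
}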

mhammond (Member) commented

Concretely, we'd have to expose a new function for each language backend like can_run_tests

Even in that scenario, a simple check that kotlinc works will succeed, but tests will fail unless I set CLASSPATH to some obscure value.

Another, simpler option (albeit with worse developer ergonomics) is to have some kind of opt-out mechanism. If the error messages noted how to opt out, it might be a reasonable option.

rfk (Collaborator, Author) commented Jun 14, 2021

Even in that scenario, a simple check that kotlinc works will succeed, but tests will fail unless I set CLASSPATH to some obscure value.

Right; in practice I think it would have to actually try to execute a tiny sample program in order to tell whether running the tests would succeed.
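
In other words, the probe would have to be an end-to-end smoke test rather than a simple existence check. For Kotlin, that might look something like this (hypothetical sketch; assumes the tempfile crate and kotlinc's script mode):

use std::process::Command;

// Sketch: compile and run a trivial Kotlin script so that environment
// problems (e.g. a broken CLASSPATH) surface in the probe itself.
fn kotlin_toolchain_works() -> bool {
    let dir = match tempfile::tempdir() {
        Ok(dir) => dir,
        Err(_) => return false,
    };
    let script = dir.path().join("probe.kts");
    if std::fs::write(&script, "println(\"ok\")").is_err() {
        return false;
    }
    Command::new("kotlinc")
        .arg("-script")
        .arg(&script)
        .output()
        .map(|output| output.status.success())
        .unwrap_or(false)
}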

Another, simpler option (albeit with worse developer ergonomics) is to have some kind of opt-out mechanism. If the error messages noted how to opt out, it might be a reasonable option.

I also wonder if splitting the language backends into their own crates (ref #299 (comment)) would help with this. Then if you don't want to run e.g. the Kotlin tests, you don't cargo test the uniffi-bindgen-kotlin crate. Not sure how that would work with our examples though.

rfk added the "get involved" label Aug 4, 2021

mhammond (Member) commented

I think we can declare this as fixed by #1031: it uses an env var instead of automatically trying to determine whether working bindings exist, but it does #[ignore] tests and seems to work well.
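
For reference, the shape of such an env-var gate is roughly the following (the variable name UNIFFI_TESTS_DISABLE is made up for illustration; see #1031 for what actually landed):

use std::env;

// Sketch only: skip a backend when it appears in a comma-separated
// opt-out variable, e.g. UNIFFI_TESTS_DISABLE="kotlin,swift".
fn tests_enabled_for(language: &str) -> bool {
    match env::var("UNIFFI_TESTS_DISABLE") {
        Ok(list) => !list
            .split(',')
            .any(|item| item.trim().eq_ignore_ascii_case(language)),
        Err(_) => true,
    }
}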
