
WinML backend for wasi-nn #7807

Merged: 21 commits merged into bytecodealliance:main from jianjunz:winml on Feb 29, 2024
Conversation

jianjunz (Contributor)

This change adds a Windows Machine Learning (WinML) backend for wasi-nn. Since WinML is a built-in feature of Windows, this change makes it possible to run wasi-nn on Windows without a third-party machine learning framework.

The WinML backend is only available on Windows.

WinML: https://learn.microsoft.com/en-us/windows/ai/windows-ml/

@jianjunz jianjunz requested review from a team as code owners January 24, 2024 06:50
@jianjunz jianjunz requested review from pchickey and removed request for a team January 24, 2024 06:50
@jianjunz jianjunz marked this pull request as draft January 24, 2024 08:43
abrown added a commit to abrown/wasmtime that referenced this pull request Jan 30, 2024
Like the rest of the `windows-*` crates published by Kenny Kerr, this
change also adds the `windows` crate itself to the trusted list. This is
necessary for use in bytecodealliance#7807.
github-merge-queue bot pushed a commit that referenced this pull request Jan 30, 2024
Like the rest of the `windows-*` crates published by Kenny Kerr, this
change also adds the `windows` crate itself to the trusted list. This is
necessary for use in #7807.
abrown (Contributor) commented Jan 30, 2024

@jianjunz: now that #7846 is merged, you should be able to rebase on that and get `cargo vet` to pass.

```rust
#[cfg(feature = "openvino")]
let mut backend = backend::openvino::OpenvinoBackend::default();
#[cfg(feature = "winml")]
let mut backend = backend::winml::WinMLBackend::default();
```

Contributor:

This won't work when both features are enabled. Features are additive, so both backends can be present at the same time.

Contributor Author:

We run the openvino and winml backends in two different test cases, with different inputs, so only one backend is specified here for testing purposes.

backend.list() is able to list all available backends.

```rust
pub fn list() -> Vec<crate::Backend> {
    let mut backends = vec![];
    #[cfg(feature = "openvino")]
    {
        backends.push(Backend::from(OpenvinoBackend::default()));
    }
    #[cfg(feature = "winml")]
    {
        backends.push(Backend::from(WinMLBackend::default()));
    }
    backends
}
```

Contributor:

What I mean is: if someone runs `cargo test --all-features`, both the OpenVINO and WinML statements that assign to `let mut backend` will be present in the code. Some background: this `all.rs` file is compiled once by `cargo test` into a single binary that runs multiple tests, so that compilation will be incorrect because the second `let mut backend` will shadow the first.
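(As an aside, here is a minimal sketch of one way to sidestep that shadowing when both features are enabled: cfg-gate a small constructor per backend and have each test call exactly the one it needs. The helper names below are hypothetical and this is not necessarily the fix the PR ended up with; only the module paths are taken from the snippet quoted above.)

```rust
// Hypothetical sketch, not code from this PR. With --all-features both
// cfg attributes are true, so two `let mut backend = ...;` statements
// would coexist in one function and the second would shadow the first.
// Gating whole helper functions keeps each test bound to a single backend.
#[cfg(feature = "openvino")]
fn openvino_backend() -> backend::openvino::OpenvinoBackend {
    backend::openvino::OpenvinoBackend::default()
}

#[cfg(feature = "winml")]
fn winml_backend() -> backend::winml::WinMLBackend {
    backend::winml::WinMLBackend::default()
}
```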

Contributor Author:

Thanks for the background. It's fixed.

```rust
#[cfg_attr(not(all(target_arch = "x86_64", target_os = "windows")), ignore)]
#[test]
fn nn_image_classification_winml() {
    run(NN_IMAGE_CLASSIFICATION_WINML, true).unwrap()
}
```

Contributor:

You will want to double-check that this test is actually run. Right now in Wasmtime's CI, only a subset of jobs run to preserve runner time. The Windows "test" job does not run by default, but if you add the string `prtest:full` in any commit, then all jobs will run. Then we can check the logs for the Windows job to ensure this test actually ran.

Contributor Author:

With `prtest:full`, this test failed on Windows Server. I guess the GitHub Windows Server runner doesn't have the Desktop Experience feature enabled.

Contributor Author:

I'm afraid we need to disable this test for now. The GitHub Actions Windows Server image doesn't have Desktop Experience available, so it cannot be installed.

List of installed or available features:
https://github.com/jianjunz/wasmtime/actions/runs/7752997447/job/21143496106#step:9:1

Contributor:

Yeah, that's unfortunate. Why don't we add a function like `check_openvino_is_installed` but for WinML: `check_winml_is_available`? There's probably a way to check with the Windows API whether ML system calls are possible, and that function could return false if they are not.

That extra runtime check would be in addition to the `#[cfg_attr(not(feature = "winml"), ignore)]`. That can stay there for the case where the winml feature isn't even enabled.
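For reference, a minimal sketch of what such a runtime probe could look like, assuming the backend is built on the `windows` crate's WinML bindings; the exact API calls and the final shape of `check_winml_is_available` in this PR may differ.

```rust
// Hedged sketch, not necessarily the code this PR adds: try to create a
// default WinML learning-model device; if that fails, WinML system calls
// are not usable on this machine and the test should be skipped. Assumes
// the `windows` crate is built with its AI/MachineLearning bindings.
#[cfg(all(feature = "winml", target_os = "windows"))]
fn check_winml_is_available() -> bool {
    use windows::AI::MachineLearning::{LearningModelDevice, LearningModelDeviceKind};
    LearningModelDevice::Create(LearningModelDeviceKind::Default).is_ok()
}

#[cfg(not(all(feature = "winml", target_os = "windows")))]
fn check_winml_is_available() -> bool {
    false
}
```

A test could then return early when `check_winml_is_available()` is false, in addition to the `#[cfg_attr(not(feature = "winml"), ignore)]` attribute.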

Contributor Author:

`check_winml_is_available` is added. It returns ok on my Windows 11 machine, but I'll need to find an environment to check whether it returns an error when WinML is not available.

@jianjunz jianjunz marked this pull request as ready for review February 5, 2024 04:51
abrown added a commit to abrown/wasmtime that referenced this pull request Feb 8, 2024
In bytecodealliance#7846 I added `cargo-vet` entries to trust the `windows` crate, just
like we already trust several related `windows-*` crates. I did not,
however, update the lockfile, which means that bytecodealliance#7807 continued to fail
the `cargo vet --locked` CI check. This change is the result of simply
running `cargo vet`.
github-merge-queue bot pushed a commit that referenced this pull request Feb 8, 2024
In #7846 I added `cargo-vet` entries to trust the `windows` crate, just
like we already trust several related `windows-*` crates. I did not,
however, update the lockfile, which means that #7807 continued to fail
the `cargo vet --locked` CI check. This change is the result of simply
running `cargo vet`.
@jianjunz (Contributor Author)

Thanks for auditing the `windows-*` crates, but `cargo vet` still fails after #7900. Only one unvetted dependency remains now:

1 unvetted dependencies:
  windows:0.52.0 missing ["safe-to-deploy"]

@abrown (Contributor) left a comment:

@jianjunz, this looks good to me with a couple of final points:

  • let's resolve the shadowing for the `cargo test --all-features` case
  • let's add the `check_winml_is_available` dynamic check
  • can you rebase on `main`? If it's not too much trouble, I might suggest squashing some of the commits since the entire commit history will get dropped into the merge commit (but this is a personal preference thing: feel free to leave as-is).


jianjunz commented Feb 28, 2024

The code is rebased with no conflicts. However, `cargo vet` reported more unvetted dependencies. I'll check whether there are any unnecessary changes to Cargo.lock.

@abrown abrown added this pull request to the merge queue Feb 29, 2024
Merged via the queue into bytecodealliance:main with commit 10a9f9e Feb 29, 2024
43 checks passed

jianjunz commented Mar 1, 2024

Thanks for reviewing and merging this PR.

@jianjunz jianjunz deleted the winml branch March 1, 2024 00:30