
Cache module instantiations per-thread #21

Merged 1 commit into dtolnay:master on Oct 31, 2019
Conversation

alexcrichton (Collaborator)

This commit continues to rejigger the API of `watt` by having a
top-level `Instance` type now instead of a suite of top-level functions.
By using a top-level `struct` we can store an internal identifier
(`usize`) which is used to key a thread-local cache for wasm blobs. This
should allow us to share resources like module instantiations across
macro invocations, ensuring that we only instantiate modules
once-per-process.

Closes #16
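
To make the caching scheme concrete, here is a minimal sketch of a thread-local cache keyed by a per-instance `usize` id. The names (`ModuleData`, `CACHE`, `with_cached_module`, `instantiate`) are hypothetical and only stand in for watt's actual internals.

```rust
use std::cell::RefCell;
use std::collections::HashMap;

// Stand-in for whatever per-module state the runtime keeps after
// instantiating a wasm blob (parsed module, store, exports, ...).
struct ModuleData;

thread_local! {
    // One cache per thread, keyed by each Instance's internal id, so a
    // given wasm blob is instantiated at most once per thread no matter
    // how many times the macros backed by it are invoked.
    static CACHE: RefCell<HashMap<usize, ModuleData>> =
        RefCell::new(HashMap::new());
}

fn with_cached_module<R>(id: usize, wasm: &[u8], f: impl FnOnce(&ModuleData) -> R) -> R {
    CACHE.with(|cache| {
        let mut cache = cache.borrow_mut();
        // Instantiate only on the first use of this id on this thread.
        let module = cache.entry(id).or_insert_with(|| instantiate(wasm));
        f(module)
    })
}

fn instantiate(_wasm: &[u8]) -> ModuleData {
    // The expensive step: parse and instantiate the wasm blob.
    ModuleData
}
```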

alexcrichton (Collaborator, Author)

Another alternative could be:

watt::instantiate_wasm!(WASM, "path/to/foo.wasm");

or

static WASM: watt::Instance = watt::instantiate_file!("path/to/foo.wasm");

mystor (Contributor) left a comment


This does limit the API a bit, in that it makes it harder to dynamically instantiate modules. It's probably ok, though, as the crate has a limited intended use-case. I'd have to modify something like crts quite a bit to get it working, but we shouldn't be optimizing for that use-case anyway.

```rust
exec::proc_macro(fun, vec![args, input], wasm)

fn id(&self) -> usize {
    static NEXT_ID: AtomicUsize = AtomicUsize::new(1);
    match self.id.load(SeqCst) {
```
mystor (Contributor)


I'm fairly confident the orderings in this method can be Relaxed, as they aren't providing any ordering guarantees for memory other than themselves.

alexcrichton (Collaborator, Author)


I personally prefer to use SeqCst everywhere unless performance becomes an issue, and I'm pretty certain performance is not an issue here.
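
For context, a sketch of the lazy id-allocation pattern suggested by the diff fragment above. It is reconstructed only from the visible lines, so the real watt code may differ; this version also uses `compare_exchange` to keep the id stable if two threads race to assign it, which the original may not do.

```rust
use std::sync::atomic::{AtomicUsize, Ordering::SeqCst};

pub struct Instance {
    // 0 means "no id assigned yet"; a nonzero value is the key into the
    // thread-local instantiation cache.
    id: AtomicUsize,
}

// Process-wide counter handing out fresh ids, starting at 1 so that 0 can
// mean "unassigned".
static NEXT_ID: AtomicUsize = AtomicUsize::new(1);

impl Instance {
    pub const fn new() -> Instance {
        Instance { id: AtomicUsize::new(0) }
    }

    fn id(&self) -> usize {
        match self.id.load(SeqCst) {
            0 => {
                // Not yet assigned: reserve a fresh id, then try to record
                // it. If another thread won the race, keep its value so the
                // id never changes once observed.
                let new = NEXT_ID.fetch_add(1, SeqCst);
                match self.id.compare_exchange(0, new, SeqCst, SeqCst) {
                    Ok(_) => new,
                    Err(existing) => existing,
                }
            }
            n => n,
        }
    }
}
```

As noted above, the orderings here could all be Relaxed: the atomics only need to hand out and remember a unique value, and nothing else is synchronized through them.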

```rust
/// ```
pub fn proc_macro(fun: &str, input: TokenStream, wasm: &[u8]) -> TokenStream {
    exec::proc_macro(fun, vec![input], wasm)

pub struct Instance {
```
mystor (Contributor)


It'd probably be good to have a doc comment on the Instance struct itself, explaining that it's intended to be used in a static.

alexcrichton (Collaborator, Author)


👍
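
A sketch of the kind of doc comment being requested; the wording is illustrative rather than the text that actually landed in the crate.

```rust
use std::sync::atomic::AtomicUsize;

/// A wasm module handle intended to be stored in a `static`.
///
/// Reusing the same `Instance` across macro invocations lets the wasm
/// module behind it be instantiated lazily and cached per thread, keyed
/// by this instance's internal id, instead of being re-instantiated on
/// every invocation.
pub struct Instance {
    // Lazily assigned cache key; see the id-allocation sketch above.
    id: AtomicUsize,
}
```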

dtolnay merged commit b267a6e into dtolnay:master on Oct 31, 2019
Successfully merging this pull request may close these issues:

Cache wasm module instantiations across macro invocations