
Prevent out-of-order arrival of audio node settings #297

Closed
orottier opened this issue Jun 19, 2023 · 4 comments · Fixed by #311
@orottier
Owner

Some of the audio node settings are propagated to the render thread via message passing, e.g. OscillatorNode.set_periodic_wave.
Others are passed with atomics, e.g. OscillatorNode.set_frequency.

The atomic settings are applied the moment the node's AudioProcessor is called, while the settings that are set via the message bus are only consumed at the start of the render quantum.

This means there is a non-negligible chance that, for a given render quantum, only the frequency change has been applied for this snippet:

osc.set_periodic_wave(w);
osc.set_frequency(440.);

We should fix this by applying all settings via the message bus.
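To make the proposed fix concrete, here is a minimal sketch using `std::sync::mpsc` with an invented `ControlMessage` enum (illustrative names, not the crate's actual types): when every setting travels over the single control→render channel, FIFO ordering guarantees the periodic wave change is observed no later than the frequency change.

```rust
use std::sync::mpsc;

// Hypothetical unified control message; variant names are illustrative.
#[derive(Debug, PartialEq)]
enum ControlMessage {
    SetPeriodicWave(Vec<f32>),
    SetFrequency(f32),
}

fn main() {
    // One control -> render channel for ALL settings of a node.
    let (tx, rx) = mpsc::channel::<ControlMessage>();

    // Control thread: the order of sends is the order of application.
    tx.send(ControlMessage::SetPeriodicWave(vec![0.0, 1.0])).unwrap();
    tx.send(ControlMessage::SetFrequency(440.0)).unwrap();

    // Render thread: drain all pending messages at the start of the quantum.
    let received: Vec<ControlMessage> = rx.try_iter().collect();
    assert_eq!(
        received,
        vec![
            ControlMessage::SetPeriodicWave(vec![0.0, 1.0]),
            ControlMessage::SetFrequency(440.0),
        ]
    );
}
```

With atomics mixed in, no such ordering exists: the atomic store becomes visible whenever the processor happens to load it, independent of when the channel is drained.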
CC @uklotzde and #291 (review)

@uklotzde
Contributor

I also noticed that each AudioParam/AudioParamProcessor pair is connected by an individual channel? This is probably not what you want. Messages should be dispatched and applied sequentially, in the same order as they were sent, across all AudioParamProcessors.

I also consider using unbounded channels for this purpose inappropriate. For any real-world use case, bounded channels are recommended. Moreover, the receiver side of these crossbeam channels cannot be used safely in a real-time context, AFAIK.

The crossbeam channels seem to have serious unresolved issues: crossbeam-rs/crossbeam#821

@orottier
Owner Author

I also noticed that each AudioParam/AudioParamProcessor pair is connected by an individual channel?

This was indeed the original implementation. When I realized this would lead to out-of-order issues, I came up with a rather exotic fix: wrapping the sender and passing it through the normal control->render message bus:
https://github.com/orottier/web-audio-api-rs/blob/main/src/param.rs#L567

This is a rather terrible implementation, I must admit, and it should be perfectly possible to drop the crossbeam channels that connect the AudioParam and the AudioParamProcessor. However, this can be a bit tricky because the concrete types of the AudioProcessors are erased in the Graph object, so AudioParamProcessors are not distinguishable from other processors.

I also consider using unbounded channels for this purpose inappropriate. For any real-world use case, bounded channels are recommended. Moreover, the receiver side of these crossbeam channels cannot be used safely in a real-time context, AFAIK.

This is definitely true. We should choose appropriate bounded channels for all online audio processing. I will create a separate issue.
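The bounded-channel semantics can be sketched with std's `sync_channel` (purely illustrative: std channels still lock and allocate internally, so a real render thread would want a lock-free SPSC ring buffer): the control side uses `try_send` so it observes back-pressure instead of growing an unbounded queue, and the render side only ever calls non-blocking receives.

```rust
use std::sync::mpsc;

fn main() {
    // Bounded channel with a fixed capacity; capacity 2 is arbitrary here.
    let (tx, rx) = mpsc::sync_channel::<u32>(2);

    // Control thread: try_send fails when the channel is full instead of
    // blocking or allocating more buffer space.
    assert!(tx.try_send(1).is_ok());
    assert!(tx.try_send(2).is_ok());
    assert!(tx.try_send(3).is_err()); // full: back-pressure is visible

    // Render thread: drain without blocking at the start of each quantum.
    let drained: Vec<u32> = rx.try_iter().collect();
    assert_eq!(drained, vec![1, 2]);

    // Empty now; try_recv returns immediately with an error, never blocks.
    assert!(rx.try_recv().is_err());
}
```

The control side must then decide what to do on a full channel (retry, coalesce, or report an error), which is exactly the policy question a bounded design forces you to answer up front.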

@uklotzde
Contributor

This is a rather terrible implementation I must admit ...

No offense, "make it work" is the appropriate strategy for getting started. Almost everything can be fixed later.

@uklotzde
Contributor

Leaving (TODO) comments with your thoughts, assumptions, and any shortcomings you are aware of when writing code would help others (or your future self) to catch up when needed.
