Currently, the buffer size varies greatly between backends. For example, coreaudio seems to consistently provide buffers of 512 frames, whereas the alsa and wasapi backends can range anywhere from 250 to 1000+ frames.
Some backends provide an API for requesting buffers of a specific size. This can be useful for tuning the trade-off between CPU usage and latency (often important in pro-audio applications).
This could possibly be a third argument to the `build_input_stream`/`build_output_stream` methods, e.g.
```rust
pub enum BufferSize {
    Default,
    Fixed(usize),
}
```

```rust
let input = event_loop.build_input_stream(&device, &format, BufferSize::Default).unwrap();
let output = event_loop.build_output_stream(&device, &format, BufferSize::Fixed(128)).unwrap();
```

Alternatively, perhaps a `buffer_size` field can be added to the `Format` type?
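If we went with the `Format` route, a rough sketch might look like the following. The existing `Format` fields are written from memory and the `SampleRate`/`SampleFormat` types are stand-ins, so treat this as illustrative rather than the actual cpal definitions:

```rust
// Sketch only: the existing `Format` fields are written from memory and the
// surrounding types are stand-ins, not the real cpal definitions.
pub struct SampleRate(pub u32);

pub enum SampleFormat {
    I16,
    U16,
    F32,
}

// The same enum as proposed above.
pub enum BufferSize {
    Default,
    Fixed(usize),
}

pub struct Format {
    pub channels: u16,
    pub sample_rate: SampleRate,
    pub data_type: SampleFormat,
    /// New: the requested buffer size, in frames.
    pub buffer_size: BufferSize,
}

fn main() {
    // A caller would then fill in the extra field when building a stream,
    // and pass the format to the build methods as today.
    let _format = Format {
        channels: 2,
        sample_rate: SampleRate(44_100),
        data_type: SampleFormat::F32,
        buffer_size: BufferSize::Fixed(128),
    };
}
```

This keeps the request alongside the rest of the stream configuration rather than adding another parameter to the build methods.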
From what I can remember, not all backends or devices can guarantee buffers of an arbitrary, fixed size. So we'd either have to emphasise that this is a "best-effort" API, or add an intermediary buffer so that we can manually provide buffers of the requested fixed size in the case that a backend or device cannot.
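For the fallback case, here is a minimal sketch of what such an intermediary buffer might look like for an input stream. The `Reblocker` name and callback shape are made up for illustration; a real implementation would need to avoid allocating on the audio thread and handle the output direction as well:

```rust
/// Accumulates samples delivered in arbitrarily sized backend callbacks and
/// re-emits them in fixed-size chunks. Names and shape are illustrative only.
struct Reblocker {
    chunk_len: usize,   // requested fixed buffer size, in samples
    pending: Vec<f32>,  // samples carried over between backend callbacks
}

impl Reblocker {
    fn new(chunk_len: usize) -> Self {
        Self { chunk_len, pending: Vec::with_capacity(chunk_len * 2) }
    }

    /// Feed whatever the backend handed us; `on_chunk` is invoked once per
    /// complete fixed-size buffer. Leftover samples are kept for next time.
    fn push(&mut self, input: &[f32], mut on_chunk: impl FnMut(&[f32])) {
        self.pending.extend_from_slice(input);
        let mut start = 0;
        while self.pending.len() - start >= self.chunk_len {
            on_chunk(&self.pending[start..start + self.chunk_len]);
            start += self.chunk_len;
        }
        self.pending.drain(..start);
    }
}

fn main() {
    let mut reblocker = Reblocker::new(128);
    // Simulate a backend handing us buffers of varying sizes.
    for &n in &[250usize, 1000, 512] {
        let backend_buffer = vec![0.0f32; n];
        reblocker.push(&backend_buffer, |chunk| {
            assert_eq!(chunk.len(), 128);
        });
    }
}
```

The obvious cost is the extra copy and added latency whenever the backend's native buffer size doesn't divide evenly into the requested one, which is part of why a "best-effort" API might be the simpler option.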