HTTP add stream support #2479
Conversation
Thanks for opening a PR! Tiny bit more context for reviewers: https://discord.com/channels/616186924390023171/1344666371316908063 This is supposed to be an alternative to #2140
@adrieljss Can you please revert the changes to Cargo.toml, Cargo.lock, package.json, and pnpm-lock.yaml? Versioning is handled in CI and this would break stuff.
Package Changes Through 194fdee
There are 4 changes, which include http with minor, http-js with minor, log with minor, log-js with minor.
Planned Package Versions
The following package releases are planned based on the context of changes in this pull request.
Add another change file through the GitHub UI by following this link. Read about change files or the docs at github.com/jbolda/covector
plugins/http/src/commands.rs
Outdated

```rust
#[derive(Clone, Serialize)]
pub struct StreamMessage {
    value: Option<Vec<u8>>,
    done: bool,
}
```
This is not performant because of serialization overhead.
To avoid this, you can send a `tauri::Response`, which is just a `Vec<u8>`, and then append a single byte at the end that is either 1 or 0, indicating the `done` status. Or even an empty array would suffice.
I am not sure if the channel is as performant as `invoke` (will have to check that later), but it would be better to avoid using the channel.
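To make the suggestion concrete, here is a minimal sketch of the receiving side, assuming (purely for illustration, this is not the plugin's actual wire format) that the Rust side appends a single trailing status byte to each raw chunk:

```typescript
// Hypothetical decoder for the proposed framing: each IPC message is the raw
// chunk bytes plus one trailing byte, 1 meaning the stream is done and 0
// meaning more chunks follow. An empty message is treated as the
// end-of-stream marker, as suggested above.
function decodeChunk(raw: Uint8Array): { value: Uint8Array; done: boolean } {
  if (raw.length === 0) {
    return { value: new Uint8Array(0), done: true };
  }
  const done = raw[raw.length - 1] === 1;
  // subarray avoids copying the payload; only the view boundaries change.
  return { value: raw.subarray(0, raw.length - 1), done };
}

const msg = new Uint8Array([104, 105, 33, 0]); // "hi!" plus a 0 status byte
const { value, done } = decodeChunk(msg);
// done: false, value: the three payload bytes [104, 105, 33]
```

No serde work is involved: the payload stays a plain byte view into the original buffer.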
Yes, you're right; I will change the struct. How about using `serde_bytes`? Plain serde could be serializing the `Vec` as an array of numbers.
I don't think there's any way to do streaming using `invoke`(?), so maybe a solution would be having separate functions for separate use cases: one for streaming and one for collecting all the contents at the end. But the user would have to specify that, and it would not be `fetch`-like.
Though, if the server doesn't want to stream, the channel will only receive one chunk, and that should be (around) the same performance, since both use IPC anyway.
This can be done with `invoke` and an `AsyncIterator` in JS:

- The removed `fetch_read_body` comes back, but instead of using `.bytes()` it will use `.chunk()`, effectively returning a single chunk each time it is called.
- On the JS side, you can create an `AsyncIterator` that will be used to create the `ReadableStream`; on each iteration it will call `fetch_read_body` and append the data to the stream.
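A short sketch of that pull-based design, with `fetchReadBody` standing in for the repeated `invoke("fetch_read_body")` call (the name and the null-when-exhausted contract are assumptions for illustration):

```typescript
// A pull-based ReadableStream: instead of a channel pushing chunks, the
// stream asks the backend for the next chunk only when the consumer reads.
type ReadBody = () => Promise<Uint8Array | null>;

function bodyToReadableStream(fetchReadBody: ReadBody): ReadableStream<Uint8Array> {
  return new ReadableStream<Uint8Array>({
    // pull() runs whenever the internal queue wants more data,
    // effectively fetching a single chunk per backend call.
    async pull(controller) {
      const chunk = await fetchReadBody();
      if (chunk === null) {
        controller.close(); // backend signals the body is exhausted
      } else {
        controller.enqueue(chunk);
      }
    },
  });
}

// Mocked backend that yields two chunks, then signals exhaustion with null.
const chunks = [new Uint8Array([1, 2]), new Uint8Array([3])];
const stream = bodyToReadableStream(async () => chunks.shift() ?? null);
```

Reading `stream` with `getReader()` then yields `[1, 2]` followed by `[3]` before the stream closes.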
Would that be better, though? The channels documentation says they are the recommended way for streaming data (like HTTP streaming, which is our case) to the frontend. I think the `invoke` approach will block operations until it has returned something, while a channel is just an event listener.
In my opinion, it feels more intuitive than spamming `invoke` calls.
The channel calls `invoke` under the hood. Channels are more ergonomic, but there is room for improvement in their implementation, as it uses `tauri::State` with a `Mutex`, unlike what I am proposing, which would use a `Mutex` only until the request is sent; after that, the body could be read without any mutexes.
For now, let's use the channel approach and I can improve on it later (once I confirm that there is mutex overhead involved).
My main blocker for this PR is the use of serialization, which we can avoid even with channels.
Yes okay, I will fix that 👍
@amrbashir can you check the new commit I did? I can't use `tauri::Response` inside the channel because it doesn't impl `Clone`, so the compiler doesn't allow it. I used `serde_bytes` to try to make it as efficient as possible, instead of just sending an array of numbers.
That clone bound shouldn't be needed; I will open a PR to fix it later.
Edit: here it is tauri-apps/tauri#12876
For now, you can use `InvokeResponseBody::Raw`, which is `Clone` and implements the `IpcResponse` trait, so it can be used with channels.
> That clone bound shouldn't be needed, I will open a PR to fix it later.
> For now, you can use `InvokeResponseBody::Raw`, which is `Clone` and implements the `IpcResponse` trait, so it can be used with channels.

done! 👍
plugins/http/src/commands.rs
Outdated

```rust
// reqwest::Response is never read, but might be needed for future use.
#[allow(dead_code)]
```
Let's remove it; no need to keep it around anymore.
plugins/http/guest-js/index.ts
Outdated

```ts
const readableStreamBody = new ReadableStream({
  start: (controller) => {
    streamChannel.onmessage = (res: Uint8Array) => {
```
This should be an `ArrayBuffer` IIRC, and in the case where the IPC fails to use the browser `fetch`, it falls back to JSON serialization, so this would be an `Array<number>`.
So we need to do some checks here, similar to what was done before.
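The check being suggested could look roughly like this (assuming only the two shapes mentioned: `ArrayBuffer` on the raw IPC path, `Array<number>` on the JSON fallback):

```typescript
// Normalize a channel message to Uint8Array. Assumption: the fast IPC path
// delivers an ArrayBuffer, while the JSON-serialization fallback delivers a
// plain Array<number>.
function toUint8Array(res: ArrayBuffer | number[] | Uint8Array): Uint8Array {
  if (res instanceof ArrayBuffer) return new Uint8Array(res);
  if (res instanceof Uint8Array) return res;
  return Uint8Array.from(res); // JSON fallback: Array<number>
}

const fromFallback = toUint8Array([1, 2, 3]); // three bytes
const fromIpc = toUint8Array(new Uint8Array([4, 5]).buffer as ArrayBuffer);
```

Either way the downstream stream code only ever sees a `Uint8Array`.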
Yes, you're right. I'll update that.
@amrbashir I added a new commit, can you check? I also fixed the content conversion bug mentioned above, by converting the
I also removed ReqwestResponse completely acafdd8; it's never read on the Rust side and is not needed in guest-js.
LGTM, now you just need to run
That's out of the plugin's scope; users will need to add a check and fallback if they wish.
looks like you need a final
Really appreciate the great work! Any plans to release this feature soon?
Good job guys, when do you think this will be released?
@amrbashir I'll approve and merge this one, it works well.
I think any further change should come after a change in tauri core too, right?
This PR was meant to add stream support for `fetch`.

What was changed:

Rust:

- Removed `fetch_read_body`, because we're already sending response body chunks through the IPC.
- Used `reqwest::chunk` to send an array of bytes (`Vec<u8>`) to JavaScript using the IPC `Channel`.

JavaScript:

- Used a `ReadableStream` as the `InitBody` in `Response`; this should be automatically collected and function as normal even if not used for streaming responses.

Example Usage
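The original example snippets did not survive extraction here; the following is a hedged sketch of how the streamed body could be consumed. In a real app the `Response` would come from the plugin's `fetch` (`@tauri-apps/plugin-http`); here a plain web `Response` built from a `ReadableStream` stands in so the snippet is self-contained:

```typescript
// Hypothetical consumption sketch: since response.body is a standard
// ReadableStream, reading it chunk by chunk looks like regular web code.
async function readAll(res: Response): Promise<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters split across chunks intact
    text += decoder.decode(value!, { stream: true });
  }
  return text;
}

// Stand-in for `await fetch("https://example.com/stream")` from the plugin.
const response = new Response(
  new ReadableStream<Uint8Array>({
    start(controller) {
      controller.enqueue(new TextEncoder().encode("hello "));
      controller.enqueue(new TextEncoder().encode("stream"));
      controller.close();
    },
  }),
);

readAll(response).then((text) => console.log(text)); // logs "hello stream"
```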
or (TypeScript might not like this)
closes #2140