Implementation of Loading-Bar functionality on the Web #338
Comments
Hi! I've tried your approach, a few notes:
```js
// Inside example.html
import init from './{{ page.title }}.js';

const originalFetch = window.fetch;
window.fetch = async function progressive_fetch(resource) {
  // your code which uses `originalFetch`
};

init();
```
My impression is that all this information must come from the engine (Rust => JS). Also, how do we know how many assets will load? Etc. Another issue is the loading UX; for now I've done this, which is… meh. 👇 bevy-example-progress-bar-01.mp4
I really appreciate you taking a look at this!
I explored the monkey patch route originally, and it's why I wrote However, I'm not sure whether we want asset-loading to necessarily use Plus, given that we need to inject the As far as injecting the code into the examples.. I don't know if there is a good way for that either, but our deployment script already makes use of Given the above context, idk. Overwriting
I suspected this would be the case, but it seems like something the engine needs to deal with. Possibly the
This is a proposal for #338.
- Builds on top of the code by @ickk using `TransformStream`.
- Adds loading feedback on examples.
- The solution is hacky, it monkey-patches `fetch` in the example template.

https://user-images.githubusercontent.com/188612/163875627-b11bf330-0fb8-4991-a710-e12e8657fe07.mp4

Co-Authored-By: ickk <17050131+ickk@users.noreply.github.com>
Co-authored-by: ickk <git@ickk.io>
This issue intends to address issues raised in #236, with some concrete design discussions.
Context
Loading bevy on the web can take quite a long time depending on the user's network connection, and the size of the files involved.
Currently on the bevyengine website some of our example pages can take a very long time to load the main wasm file & display the canvas, and longer still to load the required assets - leaving the dreaded grey box lingering. To a user this can seem like something is broken or frozen; they may refresh or click away before the page finishes loading. This is bad UX and clearly undesirable.
Concerns
There are two distinct items of concern:
- Tracking the loading status of assets after the wasm module is running should be possible from inside rust/bevy, and might be possible to handle well with some modifications to `AssetServer` in a platform-independent way.
- However, tracking the loading status of the main wasm module is obviously not possible from within rust/bevy, as bevy's logic is contained within the wasm module itself! Therefore we need a javascript solution.
Investigation
On the web platform both the wasm module and subsequent asset downloads are handled through the browser's `fetch` API. A `Response` is returned as soon as the `fetch` has received the headers, but `Response` does not provide a method to easily get at the current progress of the body (% of data actually loaded).

The `response.body` is a `ReadableStream` (part of the Web Streams API). A `ReadableStream` can only have one reader at a time (returned by the `.getReader()` method), and the data from a `ReadableStream` can only be read once. The implication of this is that a `Response` is effectively a single-use object, so we cannot simply read the stream to count the data as it is loaded and then pass the same `Response` on to the caller.

The Web Streams API also specifies a `.pipeThrough()` method on `ReadableStream`. `.pipeThrough()` takes a `TransformStream`, which allows us to cleanly place a bit of code that can access the chunks as they are streamed through and simply pass that data (or a transformed version of that data) on to the receiver. This would make it extremely ergonomic to extend the behaviour of `fetch` while providing pretty much the same API to consumers of the response.
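As a concrete illustration of this idea, here is a minimal sketch of a byte-counting wrapper around `fetch` using `pipeThrough` and a `TransformStream`. The names `trackedFetch` and `onProgress` are made up for this example and are not part of any existing API:

```js
// Minimal sketch: wrap fetch so that a TransformStream counts bytes as they
// stream through. `onProgress(loaded, total)` is a hypothetical callback.
async function trackedFetch(resource, onProgress) {
  const response = await fetch(resource);
  const total = Number(response.headers.get('Content-Length')) || 0;
  let loaded = 0;

  // The transform forwards every chunk unchanged, but lets us observe it.
  const counter = new TransformStream({
    transform(chunk, controller) {
      loaded += chunk.byteLength;
      onProgress(loaded, total);
      controller.enqueue(chunk);
    },
  });

  // Wrap the piped body in a new Response, so the caller still receives a
  // normal Response object and can consume it exactly as before.
  return new Response(response.body.pipeThrough(counter), {
    status: response.status,
    statusText: response.statusText,
    headers: response.headers,
  });
}
```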
It would be possible (but likely messier) to get some similar behaviour by calling either `readableStream.tee()` or `response.clone()`. There is also a proposal for a `FetchObserver` feature, but it has been stale for quite a long time, and its WIP implementation was removed from Firefox last year.

Implementation
Our implementation of `progressive_fetch` lets the user specify 3 callbacks. Crucially, the result of a `progressive_fetch` behaves identically to `fetch` as far as the consuming code is concerned. Compared to a regular call to fetch:
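As a rough sketch of how the call sites compare, assuming a `progressive_fetch(resource, callbacks)` signature (the callback names and file path below are illustrative, not taken from the actual implementation):

```js
// A regular fetch gives no visibility into download progress:
const plainResponse = await fetch('/assets/example_bg.wasm');

// progressive_fetch is called the same way, with an extra callbacks argument;
// the returned Response is consumed exactly like the one from fetch:
const trackedResponse = await progressive_fetch('/assets/example_bg.wasm', {
  start:    (total) => console.log(`download started, ${total} bytes expected`),
  progress: (loaded, total) => console.log(`${loaded}/${total} bytes`),
  finish:   () => console.log('download complete'),
});
```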
Using progressive fetch might look like:
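A hedged sketch of such usage, driving a hypothetical `<progress id="loading-bar">` element (the element id and callback names are again illustrative):

```js
// Wire the three callbacks up to a <progress> element in the page.
const bar = document.getElementById('loading-bar');

const response = await progressive_fetch('/assets/example_bg.wasm', {
  start:    (total) => { bar.max = total; bar.value = 0; },
  progress: (loaded) => { bar.value = loaded; },
  finish:   () => { bar.hidden = true; },
});

// The Response can then be consumed as if it came straight from fetch().
const bytes = await response.arrayBuffer();
```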
This is extremely customisable and ergonomic. Changing the exact behaviour and style of the loading bar is easy.
Integration
For our purposes, we need to replace the calls to `fetch` in the generated example.js files with a call to `progressive_fetch`. There may be a 'correct' way to do this, but we could fall back to sed if this is not easy.

We obviously also need to provide a design for the loading bar itself, and add the relevant html and css to the template.
Problems
While this is in my opinion the cleanest way to provide this functionality, a big problem with using the Web Streams API is that Firefox does not fully implement the spec. Most notably, `pipeThrough` is missing. All other major browsers seem to support the functionality we need (except of course Internet Explorer, which doesn't support Web Assembly anyway).

There is a polyfill based on the WHATWG reference implementation.
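If we go down that route, a small feature check could decide whether to load the polyfill or simply skip progress reporting; a hedged sketch (the fallback behaviour here is an assumption, not something decided in this issue):

```js
// Detect whether the browser supports the pieces of the Web Streams API we
// need (Firefox currently lacks pipeThrough / TransformStream).
const supportsStreams =
  typeof TransformStream === 'function' &&
  typeof ReadableStream === 'function' &&
  typeof ReadableStream.prototype.pipeThrough === 'function';

if (!supportsStreams) {
  // Either load the WHATWG-based polyfill here, or fall back to a plain
  // fetch without a loading bar.
}
```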
Further Discussion
We need to determine whether Bevy's `AssetServer` can easily pull the information from the `Response` object of a regular `window.fetch` required to provide download-progress of assets in the engine. An alternative would be to switch to `progressive_fetch` in this case as well; all the `AssetServer` would need to do is copy the progress value to something it can track. However, users of bevy who deploy to the web themselves would then need a copy of `progressive_fetch`.

I am not a web programmer by trade, so if there are more appropriate ways to implement this I would be interested to hear your feedback.