Propshaft race condition with webpack creates 410 with dynamic chunk loading #110
Comments
Hello @joker-777, thanks for trying Propshaft and raising the issue. I'm not familiar with LiveReload, but from Propshaft's side, it adds an extra […]. As for production, […]
Hi @brenogazzola! Thank you! I work with @joker-777 and we investigated this issue together. We would really appreciate it if you could look into it, and we're also happy to try out any ideas. As we mentioned, reproducing it is quite tricky. As for production, yes, the manifest keeps track of versions of assets and clears them on […]
My two working ideas are using a semaphore while the cache is reloading (which is going to have an impact on performance), or figuring out how to cache-bust only the modified files (which makes the code more complex). Need to find time to test each of them.
You mean from […]?
It is, however, how […]
I'm not sure about the more complex cache-busting, but I'd give the semaphore a try, at least to see if it solves the issue we're seeing. I'll keep you posted if I find anything. That's a great idea! Thank you @brenogazzola.

About the […]: with "normal" assets, let's say […], propshaft applies its own digest to the file name. With ".digested" assets, propshaft gets an already-digested file and doesn't apply another digest to it, so we might produce, e.g. […] (see `propshaft/lib/propshaft/output_path.rb`, lines 55 to 60 at 3903815).

This means that each […]

Hope this makes sense? I'm not that familiar with propshaft yet, so I hope I didn't mess this up again :)
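For readers following along, here is a rough, self-contained sketch of the naming behaviour described above. It is not the Propshaft source (the real logic is in the `output_path.rb` lines referenced above); the file names, digest length and regexp are made up for illustration.

```ruby
require "digest"

# Illustrative only: "normal" assets get a content digest appended to the
# file name, while files that already carry a "-<hex>.digested" marker are
# passed through untouched.
def output_name(logical_path, content)
  if logical_path =~ /-\h+\.digested\./
    # Pre-digested file (e.g. a webpack chunk): keep the name as-is.
    logical_path
  else
    # "Normal" asset: append a digest derived from the file content.
    ext    = File.extname(logical_path)
    digest = Digest::SHA1.hexdigest(content)[0, 8]
    "#{logical_path.delete_suffix(ext)}-#{digest}#{ext}"
  end
end

puts output_name("application.js", "console.log('hi')")
# => something like "application-1a2b3c4d.js"
puts output_name("chunk-796acbf23e0d4a51d096.digested.js", "/* chunk */")
# => "chunk-796acbf23e0d4a51d096.digested.js" (left untouched)
```

Presumably this is also why the pre-digested chunks accumulate: every webpack build that changes a chunk emits a brand new file name, and nothing ever overwrites or removes the old one.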
Yep! Adding a mutex around the cache cleanup seems to solve the issue (fingers crossed). I have very limited experience with concurrency in general, and basically zero with Ruby concurrency, but I simply wrapped the cleanup code in a mutex and it seems to do the trick:

```ruby
class Propshaft::LoadPath
  # ...
  def initialize(paths = [], version: nil)
    @paths = dedup(paths)
    @version = version
    @mutex ||= Mutex.new
  end

  # ...

  def clear_cache
    @mutex.synchronize do
      @cached_assets_by_path = nil
      assets_by_path # seeds the cache
    end
  end
end
```
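To make the race easier to picture, here is a minimal, self-contained sketch (not Propshaft code; the cache hash and file name are invented) of the situation the mutex guards against: one thread clears and reseeds the cache while a request thread reads it.

```ruby
# One thread plays Propshaft clearing/reseeding the cache, the other plays a
# request looking up a dynamic chunk. Without the lock, the reader can hit
# the window where the cache is empty and treat an existing file as missing.
cache = { "chunk-abc.digested.js" => "/assets/chunk-abc.digested.js" }
lock  = Mutex.new

clearer = Thread.new do
  lock.synchronize do
    cache = nil    # cache invalidated...
    sleep 0.01     # ...rebuilding takes a moment
    cache = { "chunk-abc.digested.js" => "/assets/chunk-abc.digested.js" }
  end
end

reader = Thread.new do
  lock.synchronize do
    puts cache.key?("chunk-abc.digested.js") ? "200" : "410"
  end
end

[clearer, reader].each(&:join)
# With the synchronize blocks the output is always "200"; drop them and the
# reader can observe the nil/half-built cache in between.
```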
@gingerlime wouldn't it be safer to parse the […]?
@tsrivishnu This is an option, but then it would only keep one version. Shouldn't it keep at least the versions of the previous deployment?
@joker-777 I think it's doable with these two options: […]
I think it's worth noting that in option 2, the manifest JSON that is versioned and kept is different from the […]
We don't use Capistrano. Versioning the manifest file could be a solution, though.
@gingerlime Could you try #110 and ensure that it still works for you?
Re: production accumulation of digest chunks, is this an issue because you're using Heroku or something? With a containerized deployment, like Kamal, each container should be starting with a fresh build.
Thanks for following up on this. I'm no longer working on this codebase but @joker-777 should be able to comment on this issue.
@dhh Thanks, I will try to test this as soon as possible.
@dhh The link in your comment doesn't work, but I guess you meant version 0.9.0. We haven't seen this error for a long time. It may be because we don't use LiveReload anymore. But we will start using 0.9.0 from now on and let you know if we find any problems.
I did mean 0.9, yes. If you're talking about cleanup of digested files, I don't know that there is a straight shot there. It's also not a problem that affects modern containerized deployments, since they'll have a clean build every time. |
Hi, first of all, thanks a lot for developing this new solution. We are using it in production and are quite happy. One issue we bump into quite frequently though is in development.
We use webpack and depend on dynamic chunk loading. That means that webpack needs to compile these chunks already with a hash and a ".digested" suffix. We also use LiveReload from webpack to refresh the page after each compilation automatically.
What we observed is that when we apply multiple changes in succession, a dynamic chunk request might return a 410. We debugged the code and could see that the asset wasn't in `@cached_assets_by_path`. We also saw that propshaft clears the cache when any of the files in `app/assets/builds` changes. We suspect there is some kind of race condition between webpack and propshaft: clearing the cache, compiling multiple times, and refreshing the browser at the same time.

We also noted that the digested assets created by webpack accumulate over time, which could also slow down the creation of `cached_assets_by_path`. This accumulation also happens in production and doesn't get cleared by `assets:clear`.

We tried to find a reproducible example, but since it is likely a race condition it is very hard. We would appreciate any thoughts and pointers.
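As a stopgap for the accumulation part, one could prune previously built chunks before kicking off a fresh webpack build. This is only a sketch under the assumption that the pre-digested chunks land in `app/assets/builds` with a `.digested` infix; it is not something Propshaft or Rails does for you.

```ruby
# Hypothetical cleanup script/rake task body: remove stale pre-digested
# chunks so they don't pile up between builds. Directory and glob pattern
# are assumptions about the webpack output configuration.
require "fileutils"

builds_dir = File.expand_path("app/assets/builds", Dir.pwd)

Dir.glob(File.join(builds_dir, "**", "*.digested.*")).each do |stale_chunk|
  FileUtils.rm_f(stale_chunk)
end
```

Whether something like this is appropriate in production depends on the deployment model; as noted above, containerized deploys start from a clean build anyway.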