Similar to the new Netlify preview, we could have a DUB mirror that updates itself daily (via a Travis cron job) and serves as the last resort at the end of the fallback chain.
The idea: prefetch all responses with wget/curl and deploy them to a CDN (GitHub Pages, Netlify, ...).
Netlify can handle a few GB just fine (docarchives.dlang.io currently takes 3 GB), but I doubt the mirror would get that large: the uncompressed JSON of all packages is currently 38 MB.
Netlify does support redirects, so the package zips don't need to be mirrored at all.
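In Netlify's `_redirects` syntax, that could look roughly like this (the zip download path pattern is illustrative, not the registry's actual URL scheme):

```
# Netlify _redirects: bounce package downloads to the live registry
# instead of storing the zips on the mirror (path pattern is illustrative).
/packages/*  https://code.dlang.org/packages/:splat  302
```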
Downsides/problems:
- search won't work (acceptable, since this is a last resort; the only goal is that building always works)
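The prefetch step described above could be sketched roughly as follows. This assumes an `/api/packages/<name>/info` endpoint shape and a plain-text package list as input; both are illustrative, not the registry's confirmed API:

```shell
#!/bin/sh
# Sketch: crawl registry API responses into a static directory tree
# that can be deployed to a CDN (Netlify, GitHub Pages, ...).

REGISTRY=${REGISTRY:-https://code.dlang.org}
OUT=${OUT:-mirror}
FETCH=${FETCH:-curl -fsSL -o}   # invoked as "<cmd> <outfile> <url>"; overridable

# Fetch one package's metadata, mirroring the live API's URL layout so
# clients can fall back to the CDN without rewriting paths.
prefetch_package() {
    pkg=$1
    mkdir -p "$OUT/api/packages/$pkg"
    $FETCH "$OUT/api/packages/$pkg/info" "$REGISTRY/api/packages/$pkg/info"
}

# Read package names (one per line) from the given file and fetch each.
prefetch_all() {
    while IFS= read -r pkg; do
        prefetch_package "$pkg"
    done < "$1"
}
```

Deploying the resulting `mirror/` directory to Netlify or GitHub Pages from the daily cron job would then be a plain static upload.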
I'd rather argue for migrating the registry to properly hosted servers and a simple HA setup instead of complicating the architecture with such a strong requirement (or uglifying it with a hack).
We could possibly funnel requests through CloudFront instead of using curl/wget, but I have had bad experiences with long 503 outages when the backend servers are unreliable.
In fact, I'm currently preparing a migration to a couple of VPS instances; let's try the proven approach first.