Introduce a simple prefetching solution #957
Conversation
lib/prefetch.js (Outdated)

```js
PREFETCHED_URLS[url] = fetch(url)
```
We could try to do this after the `load` event. When we reach this page, we've already passed `DOMContentLoaded`. Since we don't do any processing, this should be fine.
This is related to #956. 🤖
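A minimal sketch of the idea above, using a hypothetical `afterLoad` helper (not part of the PR): defer work until the `load` event, running it immediately if the event has already fired.

```javascript
// Hypothetical helper (not in the PR): run fn after the `load` event,
// or immediately if the page has already finished loading.
function afterLoad (fn) {
  // On the server (or in tests) there's no window, so just run it
  if (typeof window === 'undefined' || document.readyState === 'complete') {
    fn()
    return
  }
  window.addEventListener('load', fn)
}

let ran = false
afterLoad(() => { ran = true })
```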
@sarukuku added to the introduction text so it's linked 👍
@rauchg I did a load test: basically, prefetching a lot of pages in a setInterval. This is Chrome's timeline profiler view. It spends a pretty small amount of time (microseconds) to send the request and get the response.
lib/router/router.js
Outdated
const { Component, err, xhr } = routeInfo.data = await this.fetchComponent(route) | ||
const ctx = { err, xhr, pathname, query } | ||
const { Component, err } = routeInfo.data = await this.fetchComponent(route) | ||
const ctx = { err, pathname, query } |
Why did we change `ctx`? I remember we use `xhr` for the error page.
The reason is the prefetched version doesn't have an `xhr` object. I'll look into this more.
lib/prefetch.js (Outdated)

```diff
- return (typeof navigator !== 'undefined' && navigator.serviceWorker)
- }
+ // Add fetch polyfill for older browsers
+ import 'whatwg-fetch'
```
I wonder if importing this module is OK, since it would be loaded on the server too?
`isomorphic-fetch` maybe?
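Another option, sketched here as an assumption rather than what the PR does: guard the polyfill behind a browser check so it never loads on the server (the PR itself uses a plain unconditional `import 'whatwg-fetch'`).

```javascript
// Sketch only: load the whatwg-fetch polyfill in browsers that lack
// fetch, and skip it entirely on the server. Using require() here is
// an assumption; the PR uses a top-level import instead.
let polyfillLoaded = false
if (typeof window !== 'undefined' && typeof window.fetch !== 'function') {
  require('whatwg-fetch')
  polyfillLoaded = true
}
console.log(polyfillLoaded) // false in Node, where there is no window
```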
I wonder how this will affect the limit on concurrent HTTP requests in the browser. (Is there a limit if we use a service worker?)
@nkzawa I think I need to find out more about that.
By the way, if we're going to ditch the SW, I'm OK with prefetching by default and ditching
@rauchg I am not sure about that.
Does ditching the service worker cost us any gains/losses vs using the service worker?
We think this only has upsides so far
@nkzawa actually it depends. Usually it's around 4-8 connections per origin. So yes, there's an issue with that (but it applies to the SW as well). With HTTP2 there won't be such a problem because of multiplexing, though it'll affect the time to fetch other resources on that origin. So, here's my idea.
Even with HTTP2 multiplexing you have some device-dependent limits on the number of concurrent requests, but the ceiling is much higher. It can become limited by the memory allocated for the responses: somewhere around 25mb of pre-allocated header memory can easily cause problems even on desktop browsers. But within that you can easily afford 100+ concurrent requests, which should cover most needs. If you need more, it makes sense to use websockets and keep track of requests and responses in JS. These limits should apply both to service workers and to other HTTP requests. Also, when running out of your per-process memory, you will get e.g. net::ERR_INSUFFICIENT_RESOURCES in Chrome/ium. Anyway, throttling prefetching at 2 concurrent requests is probably a sane default, considering mobile.
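The 2-concurrent-request throttle suggested above could be sketched like this; `limitConcurrency` is an illustrative helper, not the PR's actual code.

```javascript
// Illustrative only: cap the number of in-flight tasks at `max`,
// queueing the rest -- roughly what throttling prefetches to 2 does.
function limitConcurrency (max) {
  let running = 0
  const queue = []
  const next = () => {
    if (running >= max || queue.length === 0) return
    running++
    const { task, resolve, reject } = queue.shift()
    task().then(
      (value) => { running--; resolve(value); next() },
      (error) => { running--; reject(error); next() }
    )
  }
  return (task) => new Promise((resolve, reject) => {
    queue.push({ task, resolve, reject })
    next()
  })
}

// Simulate 5 prefetches and record the peak concurrency
const run = limitConcurrency(2)
let inFlight = 0
let peak = 0
const fakePrefetch = () => {
  inFlight++
  peak = Math.max(peak, inFlight)
  return new Promise((resolve) => setTimeout(() => { inFlight--; resolve() }, 10))
}
const allDone = Promise.all([1, 2, 3, 4, 5].map(() => run(fakePrefetch)))
allDone.then(() => console.log('peak concurrency:', peak)) // never exceeds 2
```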
lib/prefetch.js (Outdated)

```diff
- if (this.props.prefetch !== false) {
-   prefetch(href)
+ if (!linkPrinted) {
+   const message = '> You are using deprecated "next/prefetch". It will be removed with Next.js 2.0.\n' +
```
@nkzawa I suggested dropping this in the 2.0 final release. I hope that's okay.
I'd like to, @arunoda. I think
lib/router/router.js (Outdated)

```diff
@@ -185,7 +194,7 @@ export default class Router extends EventEmitter {
   try {
     const { Component, err, xhr } = routeInfo.data = await this.fetchComponent(route)
-    const ctx = { err, xhr, pathname, query }
+    const ctx = { err, pathname, query, xhr }
```
We should rename `xhr` if we replace it.
Yeah. That's on my todo list. How about "jsonPageRes"?
@nkzawa Yep. With fetch we can't really abort. My suggestion is that we don't need to. Usually it's very rare for an xhr abort to happen, and even when we abort, the request might have already reached the server. Since we also do prefetching and have aggressive caching, this won't be a problem.
@nkzawa This is ready.
#### With `<Link>`

```diff
- You can substitute your usage of `<Link>` with the default export of `next/prefetch`. For example:
+ You can add `prefetch` prop to any `<Link>` and Next.js will prefetch those pages in the background.
```
I was thinking prefetching is the default behavior. What's the reason?
Do we need to do that? I think it's better if users turn it on. That way, we have no surprises.
👍 makes sense
lib/prefetch.js (Outdated)

```diff
- let { pathname } = urlParse(href)
- const url = `/_next/${__NEXT_DATA__.buildId}/pages${pathname}`
+ let apiPrinted = false
+ let linkPrinted = false
```
We'd like to use `execOnce` in `./util` instead of having each flag.
I'll use it.
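A sketch of what the suggested `execOnce` helper might look like; the actual implementation in `./util` may differ.

```javascript
// Sketch of an execOnce-style wrapper: the wrapped function's body
// runs at most once, replacing manual flags like `linkPrinted`.
function execOnce (fn) {
  let used = false
  return (...args) => {
    if (used) return
    used = true
    return fn(...args)
  }
}

// Usage: a deprecation warning that only prints the first time
let warnings = 0
const warnDeprecated = execOnce(() => { warnings++ })
warnDeprecated()
warnDeprecated()
console.log(warnings) // -> 1
```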
```js
// contains a map of responses for prefetched routes
this.prefetchedRoutes = {}
this.prefetchingLockManager = new LockManager(2)
this.prefetchingRoutes = {}
```
I wonder if we can simplify this state management by having just a Promise of fetch?
Additionally, I think the state of prefetch and normal fetch results could be managed by a single object.
Actually, we can't simplify it. But we could isolate the fetching logic and move it to some other module. Since we are fetching in parallel, we need to keep these states.
Do you want to move the fetching logic somewhere else?
Maybe that's because of `LockManager`? Then I will try to refactor after this is merged.
@nkzawa with or without it, since we are fetching in parallel we have to do it.
In the previous version we didn't have that check, so it was possible to fetch the same page multiple times.
Let me explain these states:
- prefetchedRoutes - stores already prefetched routes
- prefetchingRoutes - keeps track of routes currently being prefetched, so when we're asked to prefetch a route that's already in flight, it won't be fetched again
- prefetchingLockManager - makes sure we only fetch two routes at once
I think we shouldn't fetch the same page multiple times?
@nkzawa we don't. But the user could ask us to. That's why we need to keep these states.
For example, let's say the user asks us to prefetch the following pages:

```js
<Link prefetch href="/abc?aa=10" as="/abc/10" />
<Link prefetch href="/abc?aa=20" as="/abc/20" />
...
```

But we only need to fetch `/abc` once. That's why.
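The dedup described above can be sketched like this; the names and URL shape are illustrative, based on the diff earlier in the thread, not the router's actual code.

```javascript
// Illustrative sketch: both /abc?aa=10 and /abc?aa=20 resolve to the
// same page component, so key the prefetch cache by pathname only.
const PREFETCHED_URLS = {}
let fetchCount = 0

// Stand-in for the real fetch of /_next/<buildId>/pages<pathname>
const fakeFetch = (url) => {
  fetchCount++
  return Promise.resolve(url)
}

function prefetch (href) {
  const pathname = href.split('?')[0] // strip the query string
  if (!PREFETCHED_URLS[pathname]) {
    PREFETCHED_URLS[pathname] = fakeFetch(pathname)
  }
  return PREFETCHED_URLS[pathname]
}

prefetch('/abc?aa=10')
prefetch('/abc?aa=20')
console.log(fetchCount) // -> 1: /abc is only fetched once
```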
lib/utils.js (Outdated)

```diff
@@ -53,3 +53,31 @@ export async function loadGetInitialProps (Component, ctx) {
   }
   return props
 }
+
+export class LockManager {
```
I think we can use https://github.com/timdp/es6-promise-pool or other existing modules. I feel the idea of a lock doesn't fit JS :D
I'd love to use an existing lib, but promise pools and similar libraries don't work here. They are good if we know the number of jobs we have to do.
In our case, we don't know how many prefetches we'll need or when they'll arrive. So what we need is a continuously running job manager with a max cap on parallel jobs.
Frankly, I couldn't find anything like that.
Anyway, I like locks and would love to use them in JS.
Unlike in other languages, memory access is thread safe :) So, I love it.
Anyway, if you know a lib which does what we want, let's use it.
With `es6-promise-pool` it looks like you don't need to know the number of jobs? And you can just recreate a pool once all existing jobs are done.
@nkzawa But then we have to wait for all the jobs to finish, no? Usually we receive requests one by one.
Then we might need to do a setTimeout, check for all the prefetches happening in the same event loop tick, and so on.
But I thought this was nicer than that.
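For illustration, here is a minimal LockManager along the lines discussed: a cap of N concurrent holders with a wait queue. This is a sketch matching the `new LockManager(2)` usage in the diff, not the code actually added in lib/utils.js, and its `get`/`release` API is an assumption.

```javascript
// Sketch only: at most `max` holders at a time; get() resolves when a
// slot is free, release() wakes the next waiter.
class LockManager {
  constructor (max) {
    this.max = max
    this.holders = 0
    this.waiting = []
  }
  get () {
    return new Promise((resolve) => {
      if (this.holders < this.max) {
        this.holders++
        resolve()
      } else {
        this.waiting.push(resolve)
      }
    })
  }
  release () {
    const nextWaiter = this.waiting.shift()
    if (nextWaiter) {
      nextWaiter() // hand the slot directly to the next waiter
    } else {
      this.holders--
    }
  }
}

// Usage: the third get() stays pending until a slot is released
const lock = new LockManager(2)
const order = []
lock.get().then(() => order.push('a'))
lock.get().then(() => order.push('b'))
lock.get().then(() => order.push('c'))
setTimeout(() => {
  console.log(order.join(',')) // 'a,b' -- 'c' is still waiting
  lock.release()
}, 0)
```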
Because we no longer use it.
Let's take this and keep iterating on it.
Basically this is a universal solution which works everywhere.
We simply invoke an XHR for all the prefetching URLs.
But we don't parse the JSON until it's actually used.
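The lazy-parse idea can be sketched as follows; the names here are illustrative, not the router's actual internals.

```javascript
// Sketch: keep the raw response text around and only JSON.parse it the
// first time it's actually needed, caching the parsed result.
function lazyJSON (rawText) {
  let parsed = null
  return {
    json () {
      if (parsed === null) parsed = JSON.parse(rawText)
      return parsed
    }
  }
}

const res = lazyJSON('{"page":"/abc"}')
// Nothing has been parsed yet; parsing happens on the first .json() call
console.log(res.json().page) // -> "/abc"
```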
Fixes #956