
Low concurrent users cap with SSR #1840

Closed
nealalpert opened this issue May 1, 2017 · 9 comments

Comments

@nealalpert

We were stress testing our Next.js app and we're running into a rather low cap on the number of concurrent connections. For the initial GET request we're using Next.js and Express to serve a server-side rendered index page.

We're implementing some analytics at the moment to get a better idea of what our bottleneck is, but I was wondering whether anyone has experienced a similar problem.

@sergiodxa
Contributor

sergiodxa commented May 1, 2017

The server render is a synchronous task, so it blocks the Node.js event loop. You must run multiple instances of your Next.js app to serve many users. You can also avoid rendering some parts to reduce the time the server render takes, and add a cache.

In the examples folder there are examples of progressive render (avoiding rendering some parts) and of caching.
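The caching idea above can be sketched with a small, self-contained LRU cache keyed by URL. This is a hedged illustration, not the actual Next.js example code: `renderPage` is a hypothetical stand-in for whatever produces the SSR HTML (e.g. the app's render handler), and the cache sizes are arbitrary.

```javascript
// Minimal Map-based LRU cache: a JS Map iterates in insertion order,
// so the first key is always the least recently used entry.
class LruCache {
  constructor(limit = 100) {
    this.limit = limit;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Re-insert to mark this entry as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.limit) {
      // Evict the least recently used entry (first in iteration order).
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const cache = new LruCache(50);

// Wrap the expensive synchronous render in a cache lookup so the
// event loop is only blocked on a cache miss. `renderPage` is a
// placeholder for the real SSR function.
function cachedRender(url, renderPage) {
  let html = cache.get(url);
  if (html === undefined) {
    html = renderPage(url); // pay the render cost only once per URL
    cache.set(url, html);
  }
  return html;
}

module.exports = { LruCache, cachedRender };
```

In an Express handler this would be called as `res.send(cachedRender(req.url, render))`; for pages with per-user content you would need to key by more than the URL, or skip the cache entirely.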

@nealalpert
Author

Thank you very much for the quick reply! I think caching the SSR will be our next step.

@arunoda
Contributor

arunoda commented May 1, 2017

Thanks @sergiodxa for the help.

@arunoda arunoda closed this as completed May 1, 2017
@nealalpert
Author

Implementing server-side render caching dramatically lowered CPU usage on our VM, as expected, but didn't increase concurrency during testing.

@jancel

jancel commented May 1, 2017

Is there any other advice for increasing concurrency? Throughput seems to hover around 5 requests per second, even if we throw 100 concurrent requests at it. Is this a Next.js side effect? It doesn't fail at a higher rate; it just doesn't process as much as we'd like to see.

@jancel

jancel commented May 1, 2017

A little more about the environment. Caddy on Lightsail -> Amazon ELB -> Amazon Linux Node v4.0.0 -> Node Version 6.9.X

Utilizing an LRU cache in the Node app really increased server performance (CPU stays around 1%).

We are using sticky sessions, because navigation breaks otherwise.

We are testing successfully with Hey. Apache Benchmark is less tolerant and fails on more than 50% of requests.

@sergiodxa
Contributor

I think it's not a side effect of Next.js; it's more a side effect of React's server render being synchronous. Maybe you can try using Preact in production, which is supposed to render faster than React.

In my experience, a basic React server render can take 200 ms or more for a normal page. To improve that you can:

  1. render fewer things server side
  2. cache your server-side rendered pages
  3. run more instances of your server behind a load balancer
  4. use Preact or Inferno to render faster

@jaspersorrio

Hello @sergiodxa, I just need to clarify something.

For:
4. use Preact or Inferno to render faster

Does this still stand with React 16?

Just curious and hoping to hear your opinion.

Thanks!

@davidgarsan

davidgarsan commented Apr 15, 2018

Yep, I recently worked on a three-way benchmark of Vue 2.5, React 16.2, and Inferno 4, and Inferno was in another league. Other things aside, like its limited ecosystem (a drawback for big projects), Inferno was clearly the fastest.

@lock lock bot locked as resolved and limited conversation to collaborators Apr 15, 2019