5

The site actually serves tens of thousands of users per day, and we are planning to add SSR so it appears in Google (unbelievable, I know) and so links get a preview when shared in messengers.

Currently it's a client-side React app, and I'm worried about how the server will hold up after the migration to Next.js.

Detecting bots should not be a problem: it can be done by user agent.

What is the optimal solution?

Maybe nginx could rewrite the URL depending on the user agent, e.g. '/page' for users and '/page@ssr' for bots. There would be a pages/page.js without getInitialProps for users and a pages/page@ssr.js for bots; these two files would fetch data in different ways and render the same component with that data (see the sketch below). What do you think?
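To make that concrete, here is a rough sketch of the split I have in mind (the shared PageContent component and the /api/page-data endpoint are made-up placeholders):

// pages/page.js — served to regular users: no getInitialProps,
// so Next.js ships the SPA shell and the client fetches the data itself.
import { useEffect, useState } from 'react';
import PageContent from '../components/PageContent'; // hypothetical shared component

export default function Page() {
  const [data, setData] = useState(null);
  useEffect(() => {
    fetch('/api/page-data') // hypothetical API route
      .then((res) => res.json())
      .then(setData);
  }, []);
  return <PageContent data={data} />;
}

// pages/page@ssr.js — the variant nginx would rewrite bot requests to:
// getInitialProps fetches the same data on the server,
// so crawlers receive fully rendered HTML.
import PageContent from '../components/PageContent';

export default function PageSsr({ data }) {
  return <PageContent data={data} />;
}

PageSsr.getInitialProps = async () => {
  // Absolute URL because this runs on the server; hypothetical endpoint.
  const res = await fetch('http://localhost:3000/api/page-data');
  return { data: await res.json() };
};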

user2541867
  • 344
  • 4
  • 12
  • 3
    Sounds like a pretty odd requirement. It probably makes more sense to cache the SSR response and run some client-side logic to generate dynamic content for each user? – Andrew Zheng Sep 30 '20 at 13:15
  • What you are suggesting would probably minimise bandwidth, depending on the nature of your site: always serve a small SPA page to users, serve the full (larger) HTML pages to bots, then populate the SPA with API data, which is also minimal. And as Andrew says, put a cache in front of the server to minimise its load. – mulllhausen Sep 20 '21 at 05:01

2 Answers

1

Also have a look at ISR (Incremental Static Regeneration).

If the concern is that the server would become very busy rendering and fail to respond quickly during peak usage, this could help. You could just do ISR for both crawlers and users in the exact same way, which would also reduce the complexity of the system.

I have described how to use it at What is the difference between fallback false vs true vs blocking of getStaticPaths with and without revalidate in Next.js?, but basically you need to return fallback from getStaticPaths and revalidate from getStaticProps:

// in getStaticPaths
{
  paths: [ /* ... */ ],
  fallback: true, // or 'blocking'
}

// in getStaticProps
{
  props: { /* ... */ },
  revalidate: <integer> // seconds
}

and then the server will re-render the page at most once every revalidate seconds and save that prerender. If multiple requests come in before the revalidate timeout expires, the page will not re-render (and may therefore contain stale data, but you can't have everything, right?).

Then, to handle user-specific data (we only ISR the content a logged-out user would see, because only such common requests can be factored out, not user-specific ones), you can make an API request after the initial load that returns only the user-specific data, and update the "logged-out" version of the page with it.
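As a minimal sketch of that pattern (the /api/liked endpoint and the exact data shapes are assumptions for illustration, not the code from my implementation):

// pages/article/[slug].js — the page body is ISR'd once for all users;
// each logged-in user then fetches only their own state on the client.
import { useEffect, useState } from 'react';

export async function getStaticPaths() {
  // Render articles on demand and cache the prerenders.
  return { paths: [], fallback: 'blocking' };
}

export async function getStaticProps({ params }) {
  // Hypothetical backend endpoint with the logged-out page content.
  const res = await fetch(`http://localhost:3000/api/articles/${params.slug}`);
  return { props: { article: await res.json() }, revalidate: 60 };
}

export default function Article({ article }) {
  const [liked, setLiked] = useState(null); // null until user data arrives
  useEffect(() => {
    // Hypothetical endpoint returning just the user-specific bit.
    fetch(`/api/liked?article=${article.slug}`, { credentials: 'include' })
      .then((res) => res.json())
      .then((body) => setLiked(body.liked));
  }, [article.slug]);
  return (
    <main>
      <h1>{article.title}</h1>
      <div>{article.body}</div>
      {liked !== null && (
        <button>{liked ? 'unlike this article' : 'like this article'}</button>
      )}
    </main>
  );
}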

Here's an example. It is an implementation of gothinkster/realworld, which is basically a tiny Medium clone. When you visit an article, for example, users see a button saying "like this article"/"unlike this article", so the client needs to know the user-specific answer to "do I like this or not?". So in my implementation I just ISR the whole page and return only this user-specific data (OK, to be honest, in that example I'm returning the full article data to work around "I don't know how to nicely make users immediately see updated blog posts after edits").

With all that said, I doubt there will be a noticeable performance/server-usage difference between SSR and ISR for clients at only tens of thousands of requests per day. These things can ultimately only be answered by benchmarking.

Ciro Santilli OurBigBook.com
  • 347,512
  • 102
  • 1,199
  • 985
-1

If you're using a custom Express server, you can try the SSR For bots solution I implemented for the same use case as yours!
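For reference, the general approach looks something like this (a rough sketch of user-agent-based routing in a custom Express server, not my module's actual code):

// server.js — route bots to a server-rendered page, users to the SPA page.
const express = require('express');
const next = require('next');

const app = next({ dev: process.env.NODE_ENV !== 'production' });
const handle = app.getRequestHandler();

// Rough bot detection by user agent; this list is illustrative, not exhaustive.
const BOT_UA = /googlebot|bingbot|yandex|facebookexternalhit|twitterbot|telegrambot/i;

app.prepare().then(() => {
  const server = express();

  server.get('/page', (req, res) => {
    if (BOT_UA.test(req.headers['user-agent'] || '')) {
      // Bots get the server-rendered variant.
      return app.render(req, res, '/page@ssr', req.query);
    }
    // Regular users get the client-side page.
    return app.render(req, res, '/page', req.query);
  });

  server.all('*', (req, res) => handle(req, res));
  server.listen(3000);
});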

motis10
  • 2,484
  • 1
  • 22
  • 46
  • I tried to use it. Your module uses Puppeteer, so I installed all the required libraries on Linux WSL for it. But in the end I got the error TimeoutError: Navigation timed out 25000ms for UserAgent "Googlebot" – Ruslan Novikov Jun 10 '22 at 12:59