Also have a look at ISR (Incremental Static Regeneration)
If the concern is that the server would become very busy rendering and fail to respond quickly during peak usage, this could help. You could just do ISR for both crawlers and users in the exact same way, which would also reduce the complexity of the system.
I have described how to use it at What is the difference between fallback false vs true vs blocking of getStaticPaths with and without revalidate in Next.js?, but basically you need to return fallback from getStaticPaths:

{
  paths: [],
  fallback: true, // or 'blocking'
}

and revalidate from getStaticProps:

{
  props: { ... },
  revalidate: <integer>
}
and then the server will re-render the page at most once every revalidate seconds and cache that prerender. If multiple requests come in before the revalidate timeout expires, the page will not be re-rendered (and may therefore contain stale data, but you can't have everything, right?).
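For concreteness, here is a minimal sketch of such a page under the pages router; the file name pages/articles/[slug].js and the api.example.com call are hypothetical placeholders for your own data source:

```js
// pages/articles/[slug].js -- hypothetical file name
export async function getStaticPaths() {
  return {
    paths: [],            // generate every article on demand
    fallback: 'blocking', // or true, if you render your own loading state
  };
}

export async function getStaticProps({ params }) {
  // Hypothetical backend call; replace with however you fetch article data.
  const res = await fetch(`https://api.example.com/articles/${params.slug}`);
  const article = await res.json();
  return {
    props: { article },
    revalidate: 60, // re-render this page at most once every 60 seconds
  };
}

export default function Article({ article }) {
  return <h1>{article.title}</h1>;
}
```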
Then, to handle user-specific data (we only ISR the logged-out version of the content, because only such shared requests can be factored out, not user-specific ones), you can make an API request after the initial load that returns just the user-specific data, and use it to update the "logged-out" version of the page.
Here's an example. It is an implementation of gothinkster/realworld, which is basically a tiny Medium clone. When you visit an article, for example, users see a button saying "Like this article"/"Unlike this article", so the client needs to know the user-specific data "do I like this or not?". So in my implementation I just ISR the whole page and return only this user-specific data (OK, to be honest, in that example I'm returning the full article data to work around "I don't know how to nicely make users immediately see updated blog posts after edits").
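As a rough sketch of that client-side step, assuming a hypothetical API route /api/articles/[slug]/liked that answers from the session cookie, the component could look something like this:

```js
// Hypothetical LikeButton: the statically regenerated page shows the
// logged-out view, then this fetches only the user-specific bit.
import { useEffect, useState } from 'react';

export default function LikeButton({ slug }) {
  const [liked, setLiked] = useState(null); // null = not known yet

  useEffect(() => {
    // Assumed API route that reads the session cookie and answers
    // "does the current user like this article?"
    fetch(`/api/articles/${slug}/liked`)
      .then((res) => res.json())
      .then((data) => setLiked(data.liked))
      .catch(() => setLiked(false));
  }, [slug]);

  if (liked === null) return null; // show nothing until we know
  return <button>{liked ? 'Unlike this article' : 'Like this article'}</button>;
}
```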
With all that said, I doubt there will be a noticeable performance/server-usage difference between SSR and ISR at only thousands of requests per day. Questions like this can ultimately only be answered by benchmarking.