Search-engine and social-media bots (Google, Facebook, etc.) usually do not execute JavaScript when they crawl a URL to build a preview (with title, description, and an image). This means the React frontend alone cannot set the head tags dynamically with JavaScript; the page has to be server-side rendered before it is served so the head tags are already in place. Instead of SSR, I was thinking that whenever I need new tags for a page, I could generate a new HTML file. So I would have index.html, product1.html, product2.html, and so on. But I don't know if this is the correct approach. If I use SSR, I lose the ability to serve my frontend from static storage, because I will also need a server for the frontend. But if I create a new HTML file for every entry that needs its own head tags, my frontend will fill up with too many HTML files.
Is there a better solution that lets me serve the frontend statically while still having different head tags depending on the URL?
Can I just have a single index.html whose head contains tags for the entire sitemap of my application, so the bot can pick the correct tags for the URL it is trying to crawl?