I developed a React app that is hosted on Amazon S3 as a static website. While trying to improve the SEO of the site, I noticed in Google Search Console that my routes return 404 errors. If I click the reported URLs, the browser displays the pages fine, but a GET request with curl returns a 404. The problem is that these routes only exist on the client side: S3 has no object for them, so a crawler never receives the JavaScript that renders the route. Is there a way to pre-render or otherwise make the site's routes crawlable by Google and other search engines without using Gatsby or Next.js?
- Thanks, but that does not solve my issue. That only shows a custom error page when a route is not found, but my routes do exist. The problem is that, because they are virtual routes of the SPA, the crawlers can't reach them while a browser can. – user14398375 May 30 '23 at 14:28
- Your case matches that solution exactly. They take the 404 errors from the S3 bucket and instead serve `index.html` with a 200 status. That allows all the routes to resolve and not have 404 errors. – Stephen Ostermiller May 30 '23 at 14:39
- Please forgive me, and thanks for replying. You are correct; this worked pretty well. – user14398375 May 31 '23 at 01:25
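For anyone landing here later: the fix from the comments is to make the bucket (and, ideally, the CDN in front of it) fall back to `index.html` for unknown paths. Below is a minimal boto3 sketch of that configuration, not a definitive setup; the bucket name `my-react-app-bucket` and the CloudFront distribution ID `E2EXAMPLE123` are placeholders, and the CloudFront step is what actually turns the crawler-visible status into a 200, since S3's error document alone still responds with a 404 status code.

```python
import boto3

# 1) S3 static website hosting: point both the index document and the error
#    document at index.html so unknown paths fall back to the SPA shell.
s3 = boto3.client("s3")
s3.put_bucket_website(
    Bucket="my-react-app-bucket",  # placeholder bucket name
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "index.html"},
    },
)

# 2) CloudFront: S3 still answers with a 404 status even when it serves
#    index.html, so rewrite 404/403 responses to /index.html with an
#    explicit 200 at the distribution level.
cloudfront = boto3.client("cloudfront")
dist_id = "E2EXAMPLE123"  # placeholder distribution ID

resp = cloudfront.get_distribution_config(Id=dist_id)
config, etag = resp["DistributionConfig"], resp["ETag"]

config["CustomErrorResponses"] = {
    "Quantity": 2,
    "Items": [
        {"ErrorCode": 404, "ResponsePagePath": "/index.html",
         "ResponseCode": "200", "ErrorCachingMinTTL": 0},
        {"ErrorCode": 403, "ResponsePagePath": "/index.html",
         "ResponseCode": "200", "ErrorCachingMinTTL": 0},
    ],
}

cloudfront.update_distribution(
    Id=dist_id, IfMatch=etag, DistributionConfig=config
)
```

With this in place, a request like `curl -I https://example.com/some/route` should come back as `200` with the SPA shell, which the crawler can then render instead of recording a 404.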