Ok, so I am a little confused about how to get Google (and other search engines) to find and index my website. The website is a simple front-end React app.
According to my research, it seems like these are the steps I must take:
- Create a `sitemap.xml` file in the `/public/index/` folder.
- Submit the `sitemap.xml` to Google here: Google Search Console.

And that is it?
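If it helps, this is roughly how I was thinking of generating the file rather than hand-writing it. It's just a sketch using Python's standard library; the base URL and the route list are placeholders for my real ones:

```python
# Sketch: build a minimal sitemap.xml for a few routes.
# BASE_URL and ROUTES are assumptions -- substitute your own values.
from xml.sax.saxutils import escape

BASE_URL = "https://www.example.com"
ROUTES = ["/", "/blog/"]  # hypothetical route list

def build_sitemap(base_url, routes):
    # One <url><loc>...</loc></url> entry per route, XML-escaped.
    urls = "\n".join(
        f"  <url>\n    <loc>{escape(base_url + route)}</loc>\n  </url>"
        for route in routes
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>\n"
    )

if __name__ == "__main__":
    print(build_sitemap(BASE_URL, ROUTES))
```

The output matches the hand-written file I show below, so at least the shape should be right.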
So this is what I currently have:
I have my site hosted on AWS Amplify. If I navigate to my site's `robots.txt` file like this: `https://www.example.com/robots.txt`, then this is what the `robots.txt` file looks like:
```
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:
```
So it's pretty empty. Is this OK? I have a feeling that file should have more info.
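One addition I've seen recommended is a `Sitemap` directive so crawlers can find the sitemap on their own; the URL here is just my guess at what it should be:

```
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

Would that line actually help, or is the empty `Disallow:` already enough?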
And this is my `sitemap.xml` file:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/blog/</loc>
  </url>
</urlset>
```
What do I need to do to get my site noticed by Google and other search engines?