I just noticed that Google hasn't crawled quite a lot of my web pages. While looking into why, I realized that a missing sitemap could be part of the problem, because my site has no sitemap or robots.txt at all.
My website has a very simple structure: an index page, a Q&A page, a login page (no user pages yet), and a search page. However, the search page can lead to a large number of restaurant pages, almost 160,000 of them, from example.com/restaurants/1000001 to example.com/restaurants/1160000.
I've just learned the basics of sitemaps, and there seem to be plenty of examples of dynamic sitemaps on Google. But according to Google's sitemap help page, a site with more than 50,000 URLs has to split them across multiple sitemaps.
My website has a very simple structure, but would generating a sitemap for all these pages put a real burden on my server? (Is it even necessary?) Also, I have no clear criterion for splitting those 160,000 pages, so what would be a good way to split them for Googlebot?
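Here is a rough, untested sketch of what I was thinking: pre-generate static sitemap files in chunks of 50,000 URLs (split simply by ID range) plus one sitemap index that lists them. The file names and output paths below are just placeholders, not anything I've deployed.

```python
# Sketch only: chunk sequential restaurant IDs into sitemaps of 50,000 URLs
# each, then write a sitemap index that points at those files.
import math

BASE = "https://example.com"
FIRST_ID, LAST_ID = 1000001, 1160000   # my restaurant ID range
CHUNK = 50000                           # per-sitemap URL limit

ids = range(FIRST_ID, LAST_ID + 1)
num_chunks = math.ceil(len(ids) / CHUNK)

sitemap_names = []
for i in range(num_chunks):
    chunk_ids = ids[i * CHUNK:(i + 1) * CHUNK]
    name = f"sitemap-restaurants-{i + 1}.xml"   # placeholder naming scheme
    sitemap_names.append(name)
    with open(name, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for rid in chunk_ids:
            f.write(f"  <url><loc>{BASE}/restaurants/{rid}</loc></url>\n")
        f.write("</urlset>\n")

# One sitemap index that robots.txt / Search Console could point to.
with open("sitemap-index.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in sitemap_names:
        f.write(f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>\n")
    f.write("</sitemapindex>\n")
```

Would regenerating these as static files once in a while (e.g. from a scheduled job) be enough, or do people usually serve sitemaps dynamically for this kind of site?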
Any tips would be a huge help for me. Thanks!