I am developing a web page for my latest project. Rather late in the process, it struck me that I need to optimise it for search engines.
I guess I can guess the answer, but I don't like guessing...
When the user clicks a link, I use jQuery to fetch new content and add it to the page dynamically. Does Google crawl the JavaScript-loaded content in some way? Or does it only follow links that are visible when doing "view source"?
Can the crawlers find the content I am fetching via JavaScript?
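To make the question concrete, this is roughly the pattern I'm using (a simplified sketch; the selector names and URL are placeholders, not my actual code):

```javascript
// Runs in the browser with jQuery loaded.
$(document).ready(function () {
  // '#more-link' and '#content' are placeholder IDs.
  $('#more-link').on('click', function (event) {
    event.preventDefault();
    // Fetch an HTML fragment and inject it into the page;
    // '/content/extra.html' stands in for my real endpoint.
    $.get('/content/extra.html', function (html) {
      $('#content').append(html);
    });
  });
});
```

So the question is whether content that only appears after this kind of AJAX request is visible to search engine crawlers at all.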