
I need to crawl the entire page for elements, but only some of it loads until I scroll down. Is there any way to counteract this? I can work around it by running scroll-to-bottom functions, but it would be nice if there were a cleaner way of doing this.

Barry Michael Doyle
Eorsak
  • Using the devtools of your browser, you should be able to check what the data source of the elements is, and send the request iteratively to that data source. It might look something like this: http://www.example.com/elements?paged=1; http://www.example.com/elements?paged=2; etc – Ceili Jun 19 '17 at 17:39
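
Ceili's suggestion can be sketched roughly as follows. The `?paged=N` query parameter, the JSON-array response shape, and the stop conditions are all assumptions for illustration; inspect the network tab on the actual site to find the real data source and response format:

```javascript
// Hypothetical helper: build the URL for one page of the data source.
// The `paged` parameter is an assumption borrowed from the comment above.
function pagedUrl(base, page) {
  return `${base}?paged=${page}`;
}

// Fetch pages iteratively until the source runs out of data.
async function fetchAllPages(base, maxPages = 100) {
  const results = [];
  for (let page = 1; page <= maxPages; page++) {
    const res = await fetch(pagedUrl(base, page));
    if (!res.ok) break;              // assumption: the source errors past the last page
    const items = await res.json();  // assumption: each page is a JSON array
    if (items.length === 0) break;   // an empty page means no more data
    results.push(...items);
  }
  return results;
}

// Usage (hypothetical endpoint):
// fetchAllPages('http://www.example.com/elements').then(console.log);
```

Hitting the data source directly is usually much faster than driving a browser, since no rendering or scrolling is involved.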

1 Answer


Here's the code to accomplish this. You can either paste it straight into the console or inject it into your crawler:

window.scrollTo(0,document.body.scrollHeight);

You can also wrap this in a setInterval call so it runs repeatedly over time. Here's an example:

setInterval(function () {
  // re-scroll to the current bottom every 50 ms so newly loaded content is reached
  window.scrollTo(0, document.body.scrollHeight);
}, 50);

EDIT: Sorry, I missed the last part of your question where you mentioned that you already know this workaround. What kind of crawler are you working with? That would help in answering your question. Also look at this example: Scroll Automatically to the Bottom of the Page
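
If the crawler drives a real browser, one refinement over a fixed-rate loop is to keep scrolling until the page height stops growing, i.e. until no more content lazy-loads. This is a sketch, not a drop-in: `scrollUntilStable`, its callback parameters, and the 500 ms interval are all my own assumptions, with the height getter and scroll action injected so the logic can be exercised outside a browser:

```javascript
// Scroll repeatedly until the reported height stops growing between ticks,
// which we take to mean all lazy-loaded content has arrived.
// getHeight/scroll are injected (hypothetical design) so this runs outside a browser too.
function scrollUntilStable(getHeight, scroll, intervalMs, done) {
  let lastHeight = -1;
  const timer = setInterval(function () {
    scroll();
    const h = getHeight();
    if (h === lastHeight) {  // no growth since the last tick: assume fully loaded
      clearInterval(timer);
      if (done) done(h);
    }
    lastHeight = h;
  }, intervalMs);
}

// In a browser it might be wired up like this (500 ms is a guess at the
// site's lazy-load latency; tune it for the target page):
// scrollUntilStable(
//   () => document.body.scrollHeight,
//   () => window.scrollTo(0, document.body.scrollHeight),
//   500,
//   () => console.log('all content loaded'),
// );
```

This avoids both spamming a fixed number of scroll calls and scrolling forever after the page has finished loading.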

rimraf
  • The problem is that it only scrolls to the bottom of the currently loaded page, so once it gets there, more content loads in. I could spam a ton of these calls, but that would make the crawler a lot slower – Eorsak Jun 17 '17 at 20:03