
I'm using Python with Selenium to scrape the Spotify for Artists webpage. I've built several scrapers before and none of them ran into Out of Memory issues, but this one does. I first ran the script on a VPS with 8 GB of RAM, and I'm now running it on a Windows 10 machine with 32 GB of RAM. That should be plenty, but the problem persists.

The problem: After some number of loops (or some amount of time), Chrome or Firefox (it doesn't matter which driver I use) runs out of memory. Python then just "pauses" and won't execute any more code. When I manually click "Reload page" in Chrome, the script continues.

The goal: How do I catch the Out of Memory error, close the driver, and restart it, so the scraper can keep running unmonitored?

[Screenshot of Chrome's "Out of memory" error page]
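One way to make this recoverable is to set a page-load timeout so a hung tab raises an exception instead of blocking forever, then wrap `driver.get()` in a retry helper that quits and rebuilds the driver. A minimal sketch follows; the helper itself is plain Python (`get_with_restart`, `make_driver`, and the `recoverable` parameter are illustrative names, not Selenium API), while the Selenium wiring in the comment uses real calls (`webdriver.Chrome`, `set_page_load_timeout`, `TimeoutException`, `WebDriverException`). Whether Chrome's out-of-memory page surfaces as a `TimeoutException` or a `WebDriverException` may vary, so the helper accepts a tuple of exception types:

```python
def get_with_restart(make_driver, driver, url, recoverable, retries=2):
    """Try driver.get(url); on a recoverable error, quit and rebuild the driver.

    make_driver: zero-argument factory returning a fresh driver
    recoverable: tuple of exception classes that mean "restart the browser"
    Returns the (possibly replaced) driver on success; re-raises the last
    error once `retries` restarts have been used up.
    """
    for attempt in range(retries + 1):
        try:
            driver.get(url)
            return driver
        except recoverable:
            try:
                driver.quit()  # free the crashed browser's memory
            except Exception:
                pass  # the browser may already be gone
            if attempt == retries:
                raise
            driver = make_driver()  # fresh browser for the next attempt
    return driver

# With Selenium it would be wired up roughly like this:
#
#   from selenium import webdriver
#   from selenium.common.exceptions import TimeoutException, WebDriverException
#
#   def make_driver():
#       d = webdriver.Chrome()
#       d.set_page_load_timeout(60)  # turn an endless hang into a TimeoutException
#       return d
#
#   driver = make_driver()
#   for url in urls:
#       driver = get_with_restart(make_driver, driver, url,
#                                 (TimeoutException, WebDriverException))
```

Restarting the whole browser (rather than just reloading the page) also resets whatever memory Chrome has accumulated across the loop, which is usually what you want for a long-running scraper.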

Stan van der Avoird
  • You're not showing your Python code or any evidence of an Out Of Memory issue – DarkKnight May 18 '22 at 07:23
  • It's not the code itself. Just switching from page to page eventually results in this Out of Memory issue. I'm not sure the code has anything to do with it. Isn't this more an issue of available RAM + Chrome + the website itself stacking up memory? – Stan van der Avoird May 18 '22 at 07:45
