I'm using Python Selenium to scrape the Spotify for Artists webpage. I've built several scrapers before and none of them ran into Out of Memory issues, but the Spotify for Artists scraper does. I first ran the script on a VPS with 8 GB of RAM and am now running it on a Windows 10 machine with 32 GB; memory shouldn't be an issue, but it is.
The problem: After a certain number of loops (or amount of time), Chrome or Firefox (it doesn't matter which driver I use) runs out of memory. When that happens, Python just "pauses" and won't execute any more code. When I manually click "Reload page" in Chrome, the script continues.
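My guess is that Python "pauses" because the blocking driver call never returns once the tab dies. As far as I know, setting a page load timeout should at least turn that hang into a catchable `TimeoutException` (the 60-second value below is an arbitrary choice on my part):

```python
from selenium import webdriver

driver = webdriver.Chrome()
# Make a hung driver.get() raise TimeoutException instead of
# blocking forever (60 seconds is an arbitrary choice).
driver.set_page_load_timeout(60)
```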
The goal: How do I catch the Out of Memory error, close the driver, and restart it, so the script can keep running unmonitored?
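For reference, here's a minimal sketch of the restart pattern I have in mind; `make_driver` and `scrape_once` are placeholders standing in for my real driver setup and scraping code:

```python
from selenium import webdriver
from selenium.common.exceptions import WebDriverException

def make_driver():
    # Placeholder: stands in for my real driver setup and options.
    driver = webdriver.Chrome()
    # Turn an endless page load into a catchable TimeoutException.
    driver.set_page_load_timeout(60)
    return driver

def scrape_once(driver):
    # Placeholder: one iteration of the real scraping loop.
    driver.get("https://artists.spotify.com/")

driver = make_driver()
while True:
    try:
        scrape_once(driver)
    except WebDriverException:
        # Covers TimeoutException (a subclass) and crashed or
        # unresponsive sessions: discard the browser and start fresh.
        try:
            driver.quit()
        except WebDriverException:
            pass  # the old session may already be dead
        driver = make_driver()
```

Is something along these lines the right approach, or is there a cleaner way to detect that the browser itself has run out of memory?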