I'm writing a crawler in Node.js. It continuously scans links, collects information from each page, and does calculations, ranking, scraping, and so on.
The only problem is that memory usage grows continually and never seems to be reclaimed.
I've tried setTimeout(), process.nextTick(), setting variables to null, and declaring global variables and reusing them to avoid creating garbage.
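Here's a stripped-down sketch of the pattern (the real queue, ranking, and scraping logic is omitted, and example.com stands in for the actual targets):

```js
const https = require('https');

const queue = ['https://example.com/'];

function processNext() {
  const url = queue.shift();
  if (!url) return;

  https.get(url, (res) => {
    let chunks = [];
    res.on('data', (chunk) => chunks.push(chunk));
    res.on('end', () => {
      let body = Buffer.concat(chunks).toString();
      // ... extract links, rank, scrape, push new URLs onto the queue ...

      // Attempts to let the GC reclaim memory between iterations:
      chunks = null;               // drop the buffered response data
      body = null;                 // drop the parsed body
      setTimeout(processNext, 0);  // also tried process.nextTick(processNext)
    });
  }).on('error', () => setTimeout(processNext, 0));
}

processNext();
```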
The only thing that has reliably freed the memory is restarting the app.
Is there a way to force garbage collection in production?
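For context, the only manual hook I'm aware of is V8's `--expose-gc` flag, which makes a global `gc()` function available, but I don't know whether relying on it in production is advisable or whether it would even help here:

```js
// Start the process with: node --expose-gc crawler.js
if (typeof global.gc === 'function') {
  global.gc(); // forces a full collection
} else {
  console.log('gc() is not exposed; restart node with --expose-gc');
}
```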