I am attempting to download ~55MB of JSON data from a webpage with PhantomJS and Python on Windows 10.
The PhantomJS process dies with "Memory exhausted" upon reaching 1GB of memory usage.
The page is loaded by entering a username and password and then calling
myData = driver.page_source
on a page that contains just a simple header and the 55MB of text that makes up the JSON data.
It dies even if I'm not asking PhantomJS to do anything with it, just get the source.
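For reference, this is roughly what the script does (the URL, element IDs, and credentials below are placeholders, not the real ones):

    from selenium import webdriver

    driver = webdriver.PhantomJS()

    # Log in through the form (placeholder URL and element IDs)
    driver.get("https://example.com/login")
    driver.find_element_by_id("username").send_keys("my_user")
    driver.find_element_by_id("password").send_keys("my_pass")
    driver.find_element_by_id("submit").click()

    # Load the page that serves the ~55MB of JSON as plain text;
    # the PhantomJS process climbs to ~1GB and dies with
    # "Memory exhausted" around here
    driver.get("https://example.com/data")
    myData = driver.page_source

    driver.quit()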
If I load the page in Chrome it takes about a minute, and the browser reports it as having loaded 54MB, exactly as expected.
The PhantomJS process takes about the same amount of time to reach 1GB of RAM usage and die.
This used to work perfectly until recently, when the data to be downloaded grew beyond about 50MB.
Is there a way to stream the data directly to a file from PhantomJS, or some setting that keeps it from ballooning to roughly 20x the necessary RAM usage? (The computer has 16GB of RAM; the 1GB limit is apparently a known problem in PhantomJS that they won't fix.)
Is there an alternative, equally simple way of entering a username and password and grabbing the data that doesn't have this flaw, and that doesn't pop up a browser window while working?
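For example, this is the kind of thing I was hoping might be possible with requests, though I'm not sure it will work if the login isn't a plain form POST (the URLs and field names here are just guesses):

    import requests

    with requests.Session() as session:
        # Assumes the site accepts a simple form POST for login;
        # the URL and field names are placeholders
        session.post(
            "https://example.com/login",
            data={"username": "my_user", "password": "my_pass"},
        )

        # Stream the response body straight to disk instead of
        # holding the whole 55MB in memory
        with session.get("https://example.com/data", stream=True) as resp:
            resp.raise_for_status()
            with open("data.json", "wb") as f:
                for chunk in resp.iter_content(chunk_size=64 * 1024):
                    f.write(chunk)

Would something like that be a reasonable replacement, or is there a way to get PhantomJS itself to behave?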