Exporting data from the NDB datastore to a Google Spreadsheet eats up the full 128 MB of runtime memory on Google App Engine, even though the data is barely 2 MB in size. How is this possible? I also created a bulk.yaml file, and I'm using gapi calls and the deferred library to export the sheet on App Engine, but it fails with an error about exceeding the runtime memory limit.
- http://tech.labs.oliverwyman.com/blog/2008/11/14/tracing-python-memory-leaks/ – BoboDarph Feb 22 '18 at 11:08
- Are you using task queues or the deferred library? – new name Feb 22 '18 at 13:31
- The 128MB doesn't hold just your 2MB of data, it also holds your entire program's memory footprint (the code itself, with all imported libraries), operational and intermediate data structures, etc. Several considerations can be found in https://stackoverflow.com/questions/35189446/app-engine-deferred-tracking-down-memory-leaks/35192334#35192334 – Dan Cornilescu Feb 22 '18 at 18:36
1 Answer
I've had the same issue with the old Python gdata library when I exported data from Cloud Datastore (NDB lib) to Google Spreadsheets.
Typically the issue didn't occur on the first export but at some later point. I looked into the memory usage of instances over time, and it increased with every export job.
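For what it's worth, this is roughly how I watched the trend (a minimal sketch for the Python 2.7 runtime; where you place the logging calls around your own export code is up to you):

```python
import logging

from google.appengine.api import runtime


def export_job():
    # Log the instance's memory statistics before and after the export;
    # runtime.memory_usage() reports current and averaged usage in MB.
    logging.info('Memory before export: %s', runtime.memory_usage())

    # ... build the rows and write them to the spreadsheet here ...

    logging.info('Memory after export: %s', runtime.memory_usage())
```

Comparing the "after" values across successive jobs made the leak obvious: the baseline kept climbing instead of returning to its starting level.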
The reason was a memory leak in my Python (2.7) code that handled the export. If I remember correctly, I had dicts and lists holding plenty of references, some of them potentially in cycles, and those references weren't explicitly deleted after a job completed. At least with gdata, there was a lot of metadata in memory for every cell or row the code referenced.
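The fix was essentially to drop the big intermediate structures explicitly and force a collection once the job is done. A sketch of the pattern (the helper names here are made up for illustration, not part of any real API; only ndb.Model.to_dict() and the gc module are real):

```python
import gc


def build_row(entity):
    # Hypothetical helper: flatten one NDB entity into a list of cell values.
    props = entity.to_dict()  # to_dict() is part of the NDB Model API
    return [props[name] for name in sorted(props)]


def export_to_sheet(entities, write_rows):
    # Materializing every row at once is what bloats memory on large exports.
    rows = [build_row(e) for e in entities]
    write_rows(rows)

    # Delete the references explicitly and collect any reference cycles,
    # so the memory is actually released before the instance picks up
    # the next deferred job.
    del rows
    gc.collect()
```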
I don't think this is an issue specific to Google App Engine, Spreadsheets, or the libraries you are using, but rather a matter of how Python deals with garbage collection: as long as references to an object remain, it keeps occupying memory.
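You can see the underlying mechanism in plain CPython, without App Engine at all:

```python
import gc


class Node(object):
    def __init__(self):
        self.ref = None


a, b = Node(), Node()
a.ref, b.ref = b, a   # two objects referencing each other: a cycle

del a, b              # refcounts never hit zero because of the cycle
print(gc.collect())   # the cycle collector reclaims them and returns the
                      # number of unreachable objects it found
```

On a long-lived instance, cycles like this pile up across requests until the collector gets around to them, which is why an explicit gc.collect() at the end of a heavy job can make a real difference.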
