I am getting the following error while loading data to Google Cloud Storage.
I found a couple of posts about the same error, but I want to find where the memory leak is happening. Please let me know how to track it down.
Error Message:
Exceeded soft private memory limit of 128 MB with 171 MB after servicing 0 requests total
Code:
import cloudstorage as _gcs  # App Engine GCS client library

def download_files(service, report_id, report_fragment):
    """Generate and print sample report.

    Args:
        service: An authorized Doubleclicksearch service.
        report_id: The ID DS has assigned to a report.
        report_fragment: The 0-based index of the file fragment from the files array.
    """
    print "Enter into download_files", report_id
    filename = "/xyz/DoubleClickSearch_Campaign" + report_id + "_" + report_fragment + ".csv"
    bucket_name = "awstogcs"
    fname = "DoubleClickSearch_Campaign11212"
    write_retry_params = _gcs.RetryParams(backoff_factor=1.1)
    gcs_file = _gcs.open(filename, 'w', content_type='text/plain',
                         retry_params=write_retry_params)
    request = service.reports().getFile(reportId=report_id, reportFragment=report_fragment)
    gcs_file.write(request.execute())
    gcs_file.close()
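My suspicion is that `request.execute()` buffers the entire report fragment in memory before the single `gcs_file.write()` call, so peak memory grows with the report size. A minimal sketch of the alternative I am considering (chunked copying, using in-memory buffers to stand in for the API response and the GCS file; the names and chunk size are my assumptions, not from the API):

```python
import io

CHUNK_SIZE = 1024 * 1024  # read 1 MB at a time so peak memory stays bounded


def copy_in_chunks(src, dst, chunk_size=CHUNK_SIZE):
    """Copy a file-like src to dst without loading it all at once."""
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # empty read signals end of stream
            break
        dst.write(chunk)
        total += len(chunk)
    return total


# Hypothetical usage: BytesIO objects stand in for the report download
# stream and the opened GCS file.
src = io.BytesIO(b"x" * (3 * 1024 * 1024))
dst = io.BytesIO()
copied = copy_in_chunks(src, dst)
```

Would something like this keep the instance under the 128 MB soft limit, and how can I confirm where the memory is actually being allocated?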
Thanks,