
I was getting the following error while loading data to Google Cloud Storage.

I found a couple of posts about the same error, but I want to find where the memory leak is happening. Please let me know how to track it down.

Error Message:

Exceeded soft private memory limit of 128 MB with 171 MB after servicing 0 requests total

Code:

import cloudstorage as _gcs  # App Engine GCS client library


def download_files(service, report_id, report_fragment):
  """Download a report fragment and write it to Google Cloud Storage.

  Args:
    service: An authorized Doubleclicksearch service.
    report_id: The ID DS has assigned to a report.
    report_fragment: The 0-based index of the file fragment from the files array.
  """
  print "Enter into download_files", report_id
  # GCS object names take the form /<bucket>/<object>.
  filename = "/xyz/DoubleClickSearch_Campaign%s_%s.csv" % (report_id, report_fragment)
  bucket_name = "awstogcs"
  fname = "DoubleClickSearch_Campaign11212"
  write_retry_params = _gcs.RetryParams(backoff_factor=1.1)
  gcs_file = _gcs.open(filename, 'w', content_type='text/plain',
                       retry_params=write_retry_params)
  # request.execute() returns the whole fragment as a single in-memory string.
  request = service.reports().getFile(reportId=report_id, reportFragment=report_fragment)
  gcs_file.write(request.execute())
  gcs_file.close()
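
To see which step pushes the instance over the limit, one option is to log the instance's memory usage around the download and the GCS write. This is a minimal sketch, assuming the Python 2.7 standard runtime where google.appengine.api.runtime exposes memory_usage(); the label strings and call sites are only illustrative:

import logging

from google.appengine.api.runtime import runtime


def log_memory(label):
  # memory_usage() reports the instance's memory statistics (in MB). Logging
  # it before and after request.execute() and gcs_file.write() shows which
  # step accounts for the growth, e.g.:
  #   log_memory('before execute')
  #   data = request.execute()
  #   log_memory('after execute')
  logging.info('%s: %s', label, runtime.memory_usage())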

Thanks,

user374374
  • Possible duplicate of [Tracking down memory leak in Google App Engine Golang application?](https://stackoverflow.com/questions/38781096/tracking-down-memory-leak-in-google-app-engine-golang-application) – Dan Cornilescu Jul 02 '17 at 02:25
  • `after servicing 0 requests total` - that could indicate that the instance class you configured (or the default you have) may not have sufficient memory for your application; try configuring an instance class with more memory. – Dan Cornilescu Jul 02 '17 at 02:29
  • Look for `instance_class` in the [app.yaml syntax](https://cloud.google.com/appengine/docs/standard/python/config/appref#syntax). Also see [Instance classes](https://cloud.google.com/appengine/docs/standard/#instance_classes); a minimal `app.yaml` sketch follows this thread. – Dan Cornilescu Jul 02 '17 at 03:36
  • @Dan - Actually I am writing a file to Google Cloud Storage. My guess is that it writes everything into memory and then copies it to GCS, and that is why I am getting this error. I have given the code for writing the file to GCS above. I actually need to write bigger files to GCS, so I thought of checking. – user374374 Jul 03 '17 at 13:29
  • @Dan - Do I need to write data to GCS chunk by chunk to avoid the memory issue, or is this memory issue something entirely different? – user374374 Jul 03 '17 at 14:03
  • Can't really say. Try writing a small chunk; if that works, then indeed writing a single large chunk is most likely the problem. In general I'd always go for chunk-by-chunk unless I know *for sure* that the max chunk size works well. Check out https://stackoverflow.com/questions/8201283/google-app-engine-how-to-write-large-files-to-google-cloud-storage – Dan Cornilescu Jul 03 '17 at 15:35
  • Thanks, Dan, for your reply. Do I need to use the Blobstore in order to write chunk by chunk, or can I write to GCS chunk by chunk without it? – user374374 Jul 03 '17 at 17:45
  • No, that's just on the read side in that particular example - your concern is on the write side, so just look at that portion. – Dan Cornilescu Jul 04 '17 at 02:16
  • Dan - In the above link, this is the code that was used to write into GCS, and it uses the Blobstore:

    PATH = '/gs/backupbucket/'
    for df in DocumentFile.all():
      fn = df.blob.filename
      br = blobstore.BlobReader(df.blob)
      write_path = files.gs.create(self.PATH + fn.encode('utf-8'), mime_type='application/zip', acl='project-private')
      with files.open(write_path, 'a') as fp:
        for buf in iter_blobstore(df.blob):
          try:
            fp.write(buf)
          except files.FileNotOpenedError:
            pass
      files.finalize(write_path)

    – user374374 Jul 04 '17 at 02:20
  • `fp.write(buf)` is what you care about - make sure your data to write is split into small enough `buf` pieces to prevent running out of memory for the instance class you chose (see the chunk-by-chunk sketch after this thread). – Dan Cornilescu Jul 04 '17 at 02:26
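
As a minimal sketch of the instance-class suggestion above: the default F1 class is the one with the 128 MB soft private memory limit seen in the error, so requesting a larger class in `app.yaml` raises that ceiling. The F2 value below is illustrative; check the Instance classes page for the current sizes.

# app.yaml (App Engine standard, Python 2.7)
runtime: python27
api_version: 1
threadsafe: true

instance_class: F2  # 256 MB soft memory limit instead of the default F1 (128 MB)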
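
And a minimal sketch of the chunk-by-chunk write discussed in the comments, using the same `cloudstorage` client as the question instead of the older Files API in the quoted snippet. The names here are assumptions: `write_report_to_gcs` is just an illustrative helper, and `chunks` stands in for however the report data is obtained in pieces (for example a streaming/media download of the fragment, if the generated DoubleClick Search client supports one).

import cloudstorage as gcs


def write_report_to_gcs(filename, chunks):
  """Write an iterable of byte-string pieces to GCS one chunk at a time.

  `chunks` is any iterator that yields the report data in small pieces, so
  the whole fragment never has to sit in instance memory at once.
  """
  retry_params = gcs.RetryParams(backoff_factor=1.1)
  gcs_file = gcs.open(filename, 'w', content_type='text/plain',
                      retry_params=retry_params)
  try:
    for buf in chunks:
      gcs_file.write(buf)
  finally:
    gcs_file.close()

Note that if the whole fragment is still fetched with a single request.execute(), splitting only the write will not reduce the peak memory; the download itself needs to be chunked (or the instance class raised) for the limit to stop being hit.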
