The following code downloads data from DoubleClick Search and uploads it to Google Cloud Storage (running on Google App Engine). The file never uploads completely: every file stops at 32 MB, while the actual file size is 4 GB. Is there any way to upload larger files to Google Cloud Storage from App Engine?
Upload to Google Cloud Storage:
import cloudstorage as _gcs

def _writeFilesinGCS(filename, data):
    # Initialize a Google Cloud Storage file object and write the payload
    print "In _writeFilesinGCS function"
    tmp_filenames_to_clean_up = []
    write_retry_params = _gcs.RetryParams(backoff_factor=1.1)
    gcs_file = _gcs.open(filename, 'w', content_type='text/plain',
                         retry_params=write_retry_params)
    gcs_file.write(data)
    gcs_file.close()
    tmp_filenames_to_clean_up.append(filename)
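Incidentally, the cloudstorage file handle supports incremental writes, so a payload can be streamed into the object piece by piece instead of being passed in as one giant string. A minimal sketch, using a hypothetical _write_chunks_to_gcs helper that accepts any iterable of byte strings:

import cloudstorage as _gcs

def _write_chunks_to_gcs(filename, chunks):
    # Hypothetical helper: 'chunks' is any iterable of byte strings;
    # each piece is written to the GCS file as it arrives, so the
    # full file never has to fit in memory at once.
    write_retry_params = _gcs.RetryParams(backoff_factor=1.1)
    gcs_file = _gcs.open(filename, 'w', content_type='text/plain',
                         retry_params=write_retry_params)
    try:
        for chunk in chunks:
            gcs_file.write(chunk)
    finally:
        gcs_file.close()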
Download the file:
def download_files(service, report_run_id, report_fragment, loaddate, file_name, cfg):
    """Download one report fragment from DoubleClick Search and store it in GCS.

    Args:
        service: An authorized Doubleclicksearch service.
        report_run_id: The ID DS has assigned to a report.
        report_fragment: The 0-based index of the file fragment from the files array.
        loaddate: Date string appended to the GCS object name.
        file_name: Base name for the GCS object.
        cfg: Config object that carries the GCS bucket name.
    """
    bucket_name = cfg._gcsbucket
    bucket = '/' + bucket_name
    filename = bucket + '/' + file_name + "_MMA_" + report_fragment + "_" + loaddate + ".csv"
    print "Enter into download_files", report_run_id
    # request.execute() returns the whole fragment as a single string,
    # so the response is subject to App Engine's 32 MB urlfetch cap.
    request = service.reports().getFile(reportId=report_run_id, reportFragment=report_fragment)
    _writeFilesinGCS(filename, request.execute())
    dsbqfuns._dsbqinsert(report_run_id, cfg, file_name, 1)
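For context, the 32 MB ceiling matches App Engine's urlfetch response-size limit, so a single request.execute() can never return the full 4 GB fragment. A minimal sketch of one possible workaround, assuming the generated DoubleClick Search client exposes a getFile_media variant (google-api-python-client adds these for methods that support media download) and using a hypothetical download_file_chunked helper; MediaIoBaseDownload fetches the file in ranged requests that each stay under the urlfetch cap and writes every piece straight to the GCS file handle:

import cloudstorage as _gcs
from googleapiclient.http import MediaIoBaseDownload

def download_file_chunked(service, report_run_id, report_fragment, filename):
    # Hypothetical helper: streams one report fragment into GCS in
    # 16 MB pieces instead of materializing the whole file in memory.
    request = service.reports().getFile_media(  # assumed media-download variant
        reportId=report_run_id, reportFragment=report_fragment)
    retry_params = _gcs.RetryParams(backoff_factor=1.1)
    gcs_file = _gcs.open(filename, 'w', content_type='text/plain',
                         retry_params=retry_params)
    try:
        # Each next_chunk() issues a ranged request of at most 16 MB,
        # safely below the 32 MB urlfetch response limit, and writes
        # the received bytes directly to the GCS file object.
        downloader = MediaIoBaseDownload(gcs_file, request,
                                         chunksize=16 * 1024 * 1024)
        done = False
        while not done:
            _status, done = downloader.next_chunk()
    finally:
        gcs_file.close()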