
I have code that uploads my archive to Google Drive using my access token and requests, but if the file is larger than 512MB it fails with a MemoryError, so I'm looking for a way to fix this and upload files larger than 512MB. I've already searched for a solution, but I couldn't find anything that works with an access token.

import os
import json
import ntpath
import httplib2
import requests
import oauth2client.client

_CLIENT_ID = 'YOUR_CLIENT_ID'
_CLIENT_SECRET = 'YOUR_CLIENT_SECRET'
_REFRESH_TOKEN = 'YOUR_REFRESH_TOKEN'
_PARENT_FOLDER_ID = 'YOUR_PARENT_FOLDER_ID'
_ARCHIVE_FILE = os.environ['USERPROFILE'] +'\\Desktop\\WobbyChip.zip'


# ====================================================================================

def GetAccessToken(client_id, client_secret, refresh_token):
    # Exchange the long-lived refresh token for a short-lived access token
    cred = oauth2client.client.GoogleCredentials(
        None, client_id, client_secret, refresh_token, None,
        'https://accounts.google.com/o/oauth2/token', None)
    http = cred.authorize(httplib2.Http())
    cred.refresh(http)
    return json.loads(cred.to_json())['access_token']


def UploadFile(local_file, parent_folder_id, access_token):
    headers = {'Authorization': 'Bearer ' + access_token}
    metadata = {
        'name': ntpath.basename(local_file),
        'parents': [parent_folder_id]}
    # Multipart upload: requests builds the whole request body in memory,
    # which is why this fails with MemoryError on large files
    with open(local_file, 'rb') as f:
        files = {
            'data': ('metadata', json.dumps(metadata), 'application/json; charset=UTF-8'),
            'file': f}
        response = requests.post(
            'https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart',
            headers=headers,
            files=files)
    response.raise_for_status()
    return response

# ====================================================================================


if __name__ == '__main__':
    UploadFile(_ARCHIVE_FILE, _PARENT_FOLDER_ID, GetAccessToken(_CLIENT_ID, _CLIENT_SECRET, _REFRESH_TOKEN))
  • Are you sure Google *allows* you to do this? – Scott Hunter Feb 14 '19 at 20:57
  • @Temporaly For example, is this thread useful for your situation? https://stackoverflow.com/q/14286402/7108653 – Tanaike Feb 15 '19 at 00:27
  • @Tanaike Thanks, you really helped. I used that post and edited my code, so now I'm able to upload files larger than 512MB, but I have a new problem: I don't know how to include the parent folder id (the id of the folder where the file will be uploaded) in the request body – Temporaly Feb 15 '19 at 18:42
  • I think you will need to use chunked upload. it's trying to open and upload the file in one instance and running out of memory – black Phox Feb 15 '19 at 09:02
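
As the comments suggest, the multipart endpoint forces requests to build the whole body in memory; the Drive v3 *resumable* upload protocol avoids that by sending the file in chunks. The sketch below is a minimal, untested illustration of that protocol (the function names and chunk size are my own choices, not from the original code): an initial POST with `uploadType=resumable` carries the JSON metadata, which is also where the parent folder id goes, answering the follow-up comment; the returned `Location` header is the session URI, and each chunk is then PUT with a `Content-Range` header, where the API answers 308 until the final chunk.

```python
import os
import json
import ntpath
import requests

# Drive requires chunk sizes that are multiples of 256 KiB (except the last chunk)
CHUNK_SIZE = 256 * 1024 * 40  # 10 MiB

def content_range(offset, chunk_len, total):
    # Header value for one chunk, e.g. 'bytes 0-10485759/536870912'
    return 'bytes %d-%d/%d' % (offset, offset + chunk_len - 1, total)

def start_resumable_upload(local_file, parent_folder_id, access_token):
    """Open a resumable session. The parent folder id goes in the JSON
    metadata body here; returns the session URI from the Location header."""
    headers = {
        'Authorization': 'Bearer ' + access_token,
        'Content-Type': 'application/json; charset=UTF-8'}
    metadata = {'name': ntpath.basename(local_file),
                'parents': [parent_folder_id]}
    r = requests.post(
        'https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable',
        headers=headers, data=json.dumps(metadata))
    r.raise_for_status()
    return r.headers['Location']

def upload_in_chunks(session_uri, local_file, chunk_size=CHUNK_SIZE):
    """Stream the file chunk by chunk so only one chunk is in memory at a time."""
    total = os.path.getsize(local_file)
    with open(local_file, 'rb') as f:
        offset = 0
        while offset < total:
            chunk = f.read(chunk_size)
            r = requests.put(session_uri, data=chunk, headers={
                'Content-Length': str(len(chunk)),
                'Content-Range': content_range(offset, len(chunk), total)})
            # 308 means "more chunks expected"; 200/201 means the upload is done
            if r.status_code not in (200, 201, 308):
                r.raise_for_status()
            offset += len(chunk)
    return r
```

Usage would mirror the original `__main__` block: `upload_in_chunks(start_resumable_upload(_ARCHIVE_FILE, _PARENT_FOLDER_ID, access_token), _ARCHIVE_FILE)`. A production version should also handle interrupted sessions by querying the session URI for the last received byte.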

0 Answers