
I am looking to upload a very large zip file (several hundred GBs) from my remote server to my google drive using the drive api v3. I tried following a tutorial at Toward Data Science but it defers the use of resumable uploads to the drive api documentation, which isn't very beginner friendly. Other questions on this matter don't handle the file sizes I am handling. They also don't mention the issue of keeping the access-token valid for the time the file is being uploaded. I also found another SO answer during my search. However, it is again a "multi-part" upload method.

Any help would be appreciated. I am looking to automate this using a python script.

Thanks in advance!

  • First, I apologize that my answer was not useful for your situation. About your question, are these threads useful? https://stackoverflow.com/q/61759449/7108653 and https://stackoverflow.com/q/64587769/7108653 and https://stackoverflow.com/q/60528771 – Tanaike Feb 16 '23 at 00:37
  • @Tanaike https://stackoverflow.com/a/60536138/16476327 is quite relevant. However, I am unable to piece together the chunking that I'll need to get from the test file to the large file. Where do I add the chunk_size parameter in this answer? – starc52 Feb 16 '23 at 17:13
  • Your file size is too big for a one-shot upload. Uploading files always involves request timeouts, max request sizes, etc. That's why you end up at documentation talking about resumable uploads. BTW, have a look at the GDrive API, its intended use, and its service limits. – Fabio B. Feb 16 '23 at 22:21
  • Thank you for replying. The sample script of https://stackoverflow.com/a/60536138 uses a single chunk. If you want to use multiple chunks, you can see ["HTTP - multiple requests" of this official document](https://developers.google.com/drive/api/guides/manage-uploads#uploading). – Tanaike Feb 17 '23 at 01:58

1 Answer


This is a PHP example, but it may still help: in PHP you can do a chunked, resumable upload like this.

public function uploadFileChunk($uploadFile)
{
    if (!$uploadFile) {
        return null;
    }

    $client = $this->getClient();
    $service = new Google_Service_Drive($client);

    // Derive the Drive file name from the last path segment.
    $uploadFileExp = explode('\\', $uploadFile);
    $uploadFileName = end($uploadFileExp);

    $file = new Google_Service_Drive_DriveFile();
    $file->setName($uploadFileName);

    // 100 MB per request.
    $chunkSizeBytes = 100 * 1024 * 1024;

    // Defer execution so files->create() returns a request object
    // we can feed chunks into, instead of executing immediately.
    $client->setDefer(true);
    $request = $service->files->create($file);

    $media = new Google_Http_MediaFileUpload(
        $client,
        $request,
        'application/octet-stream',
        null,
        true,              // resumable
        $chunkSizeBytes
    );

    $handle = fopen($uploadFile, "rb");
    $media->setFileSize(filesize($uploadFile));

    // Upload one chunk per request; nextChunk() returns false until
    // the final chunk, when it returns the created Drive file.
    $status = false;
    while (!$status && !feof($handle)) {
        $chunk = fread($handle, $chunkSizeBytes);
        $status = $media->nextChunk($chunk);
    }

    fclose($handle);
    $client->setDefer(false);

    return $status ? $status->id : null;
}
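Since the question asks for a Python script, here is a hedged sketch of the same chunked, resumable upload using google-api-python-client's `MediaFileUpload` with `resumable=True`. It assumes you already have an authorized `creds` object from the usual OAuth flow; the helper names (`upload_large_file`, `chunk_count`) are mine, not from the library.

```python
import os

CHUNK_SIZE = 100 * 1024 * 1024  # 100 MB per request, matching the PHP example


def chunk_count(file_size, chunk_size=CHUNK_SIZE):
    """Number of chunked requests needed to cover file_size bytes."""
    return max(1, -(-file_size // chunk_size))  # ceiling division


def upload_large_file(creds, path, name=None):
    """Resumable chunked upload to Drive; returns the new file's ID.

    Assumes `creds` is an authorized google.oauth2 Credentials object.
    """
    # Imports kept local so the pure helper above stays importable
    # without google-api-python-client installed.
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    service = build("drive", "v3", credentials=creds)
    media = MediaFileUpload(
        path,
        mimetype="application/octet-stream",
        chunksize=CHUNK_SIZE,
        resumable=True,  # enables one-HTTP-request-per-chunk uploads
    )
    request = service.files().create(
        body={"name": name or os.path.basename(path)},
        media_body=media,
        fields="id",
    )
    response = None
    while response is None:
        # Each next_chunk() call sends one chunk and can be retried on failure.
        status, response = request.next_chunk()
        if status:
            print(f"Uploaded {int(status.progress() * 100)}%")
    return response["id"]
```

Because each chunk is a separate HTTP request, a failed request only forces a retry of that chunk, not the whole multi-hundred-GB upload.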

And you need to refresh your access token when uploading very large files (50 GB+), since the upload can take longer than the token's lifetime:

if ($client->isAccessTokenExpired()) {

    $client->fetchAccessTokenWithRefreshToken($refreshToken['refresh_token']);

    file_put_contents($credentialsPath, json_encode($client->getAccessToken()));
}
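In Python, the google-auth client libraries generally refresh tokens for you, but if you persist tokens yourself you can do the equivalent of the PHP snippet above. This is a sketch under that assumption; `ensure_fresh` and `token_path` are hypothetical names, and `creds` is assumed to be a `google.oauth2.credentials.Credentials` object that carries a refresh token.

```python
def ensure_fresh(creds, token_path="token.json"):
    """Refresh expired credentials and persist them (hypothetical helper).

    Assumes `creds` behaves like google.oauth2.credentials.Credentials:
    it exposes .expired, .refresh_token, .refresh(), and .to_json().
    """
    if creds.expired and creds.refresh_token:
        # Deferred import so this module loads without google-auth installed.
        from google.auth.transport.requests import Request

        creds.refresh(Request())
        with open(token_path, "w") as fh:
            fh.write(creds.to_json())  # persist, like file_put_contents above
    return creds
```

Calling this before starting the upload (or between retries of a long-running chunked upload) mirrors the PHP `isAccessTokenExpired()` / `fetchAccessTokenWithRefreshToken()` pattern.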