51

Is there a way to download a publicly-viewable Google Drive url via curl or wget? For example, being able to do something like:

curl -O myfile.xls https://drive.google.com/uc?export=download&id=1Wb2NfKTQr_dLoFJH0GfM0cx-t4r07IVl

Note, I'm looking to do this on a publicly-viewable file without having to sign into my Google account (or have someone else sign into their account, etc.).

If helpful, the cors headers I have are:

 "kind": "drive#file",
 "id": "1Wb2NfKTQr_dLoFJH0GfM0cx-t4r07IVl",
David542
  • Possible duplicate of [wget/curl large file from google drive](https://stackoverflow.com/questions/25010369/wget-curl-large-file-from-google-drive) – craq Sep 23 '19 at 22:45

13 Answers

115

How about this method? When the file is large, Google returns a confirmation code for downloading it, and you need that code to fetch the file. When such a large file is requested with curl, the response contains the code as follows.

<a id="uc-download-link" class="goog-inline-block jfk-button jfk-button-action" href="/uc?export=download&amp;confirm=ABCD&amp;id=### file ID ###">download</a>

The query parameter confirm=ABCD is required for downloading the file. The same code is also stored in the cookie, where it appears as follows.

#HttpOnly_.drive.google.com TRUE    /uc TRUE    #####   download_warning_#####  ABCD

In this case, "ABCD" is the code. The following script retrieves the code from the cookie and uses it to download the file.

Sample script :

#!/bin/bash
fileid="### file id ###"
filename="MyFile.csv"
curl -c ./cookie -s -L "https://drive.google.com/uc?export=download&id=${fileid}" > /dev/null
curl -Lb ./cookie "https://drive.google.com/uc?export=download&confirm=`awk '/download/ {print $NF}' ./cookie`&id=${fileid}" -o ${filename}
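To see what the backticked awk command extracts, you can run it against a sample cookie line by itself; every value below, including the ABCD token, is a placeholder:

```shell
# Feed a cookie line of the shape shown above through the same awk filter;
# it selects the line containing "download" and prints its last field (the token).
printf '#HttpOnly_.drive.google.com\tTRUE\t/uc\tTRUE\t0\tdownload_warning_12345\tABCD\n' |
  awk '/download/ {print $NF}'
```

The filter prints ABCD, which the main script then splices into the confirm= parameter.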

If this was not useful for you, I'm sorry.

Updated at February 17, 2022

Recently, it seems that the specification of this flow has changed, so I updated this answer. To download a large, publicly shared file from Google Drive, you can use the following script.

#!/bin/bash
fileid="### file id ###"
filename="MyFile.csv"
html=`curl -c ./cookie -s -L "https://drive.google.com/uc?export=download&id=${fileid}"`
curl -Lb ./cookie "https://drive.google.com/uc?export=download&`echo ${html}|grep -Po '(confirm=[a-zA-Z0-9\-_]+)'`&id=${fileid}" -o ${filename}
  • In this case, the ID for downloading is retrieved from the HTML data as follows.

      <form id="downloadForm" action="https://drive.google.com/uc?export=download&amp;id={fileId}&amp;confirm={value for downloading}" method="post">
    
  • When you want to download a publicly shared file of small size from Google Drive, you can use the following command.

      curl -L "https://drive.google.com/uc?export=download&id=### fileId ###" -o sampleoutput.csv
    
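To illustrate what the grep -Po expression (GNU grep is assumed) pulls out of that HTML, here is the same pattern run against a minimal stand-in for the download form; the confirm value t8hc is made up:

```shell
# Minimal stand-in for the download-form HTML; the confirm value t8hc is a placeholder.
html='<form id="downloadForm" action="https://drive.google.com/uc?export=download&amp;id=FILEID&amp;confirm=t8hc" method="post">'
# -P enables Perl-compatible regexes, -o prints only the matching part.
echo "${html}" | grep -Po '(confirm=[a-zA-Z0-9\-_]+)'
```

The match, confirm=t8hc here, is exactly the query parameter the script appends to the download URL.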
Tanaike
    this is the only solution that works without installing anything, thanks – Frz Khan Feb 28 '19 at 06:47
  • Hi Tanaike, sometimes the cookie file will not contain the cookie with `.drive.google.com`. This would happen occasionally, I tried wget but still got a similar cookie file. Do you know why this could happen? – Stanley Wang Sep 10 '21 at 04:07
  • @shooding Thank you for your comment and for testing it. I'm glad your issue was resolved. Thank you, too. – Tanaike Jan 31 '23 at 22:54
  • These commands download the virus scan warning html page for me. – Steve May 02 '23 at 14:58
  • @Steve Thank you for your comment. About `These commands download the virus scan warning html page for me.`, I'm worried that the specification might be changed at Google side. So, when I tested the method of "Updated at February 17, 2022", I confirmed that both large-size and small-size files could be correctly downloaded. From this result, it seems that the specification is not changed. Unfortunately, I cannot replicate your situation. I apologize for this. But I would like to support you. When I could correctly replicate your situation, I would like to think of a solution. – Tanaike May 03 '23 at 04:36
36

Simplest and best way (with a real Google Drive file example)

  1. Install gdown using pip

    • Command - pip install gdown
  2. Let's say I wish to download cnn_stories.tgz from Google Drive

    • Download Link: https://drive.google.com/uc?export=download&id=0BwmD_VLjROrfTHk4NFg2SndKcjQ
  3. Please note the id URL parameter 0BwmD_VLjROrfTHk4NFg2SndKcjQ in the link

  4. That's it! Download the file using gdown

    • gdown --id 0BwmD_VLjROrfTHk4NFg2SndKcjQ --output cnn_stories.tgz


TLDR: gdown --id {gdrive_file_id} --output {file_name}


Command Line Args:

--id : Google drive file ID

--output: Output File name

Adithya Upadhya
23

You need to use the -L switch to make curl follow redirects, and the correct switch for the filename is -o. You should also quote the URL:

 curl -L -o myfile.xls "https://drive.google.com/uc?export=download&id=0B4fk8L6brI_eX1U5Ui1Lb1FpVG8"
  • thanks, that works on small files. However, if the file is larger, using the above code I get something that says `"MyFile.csv (711M) is too large for Google to scan for viruses. Would you still like to download this file?`. How would I download a file of any size then? – David542 Jan 07 '18 at 00:11
  • You wouldn't be able to do that with curl or wget alone. You would need a program to recognise that page and submit a "Yes" response. –  Jan 07 '18 at 00:13
14

Simply

wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O DESTINATION_FILENAME


Amr Lotfy
  • although it downloads the large file fine. But somehow it corrupted the file, by merging few files into one? – True Nov 24 '21 at 06:42
9

No need to install any external tool.

  1. Make the file public on Google Drive
  2. Open the shared link in an incognito window
  3. Click the download button
  4. Copy the request as curl from the browser's network tab
  5. Paste the copied request on your server and append the output flag (--output <FILENAME>)


kunwar97
6

I've just checked the answer of @Tanaike and it works like a charm. But the solution proposed by @Martin Broadhurst and accepted by the asker doesn't, because Google shows the virus-scan warning, which needs to be processed, so a script is required.

I'd like to vote for @Tanaike's answer, but I don't have enough reputation to do so :)

Additionally, for those who don't know how to get the ID of a file on Google Drive, I'd like to share this pretty simple procedure.

  1. Go to your Google Drive
  2. Right-click the file and select "Share"
  3. Choose public sharing, so that anyone who has the link can access it with no login required
  4. Copy the URL: https://drive.google.com/file/d/1FNUZiDDDDDDSSSSSSAAAAAdv42Qgzb6n8d/view?usp=sharing
  5. Paste it into a notepad
  6. The ID is the part of the URL between /d/ and /view: 1FNUZiDDDDDDSSSSSSAAAAAdv42Qgzb6n8d

Enjoy!
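Step 6 can also be automated; here is a small sed sketch that pulls the ID out of a share URL of the /file/d/&lt;ID&gt;/view form (the URL below is the example from step 4):

```shell
# Extract the file ID from a Google Drive share URL of the /file/d/<ID>/view form.
url="https://drive.google.com/file/d/1FNUZiDDDDDDSSSSSSAAAAAdv42Qgzb6n8d/view?usp=sharing"
# Capture everything between "/file/d/" and the next "/".
fileid=$(echo "${url}" | sed -n 's#.*/file/d/\([^/]*\)/.*#\1#p')
echo "${fileid}"
```

The printed value is the ID you would plug into the download commands above.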

Konstantyn
  • Thank you for your comment. In order to more easily download shared files, I created this. If this is useful for you, I'm glad. https://github.com/tanaikech/goodls – Tanaike Feb 23 '18 at 11:25
  • Not able to install it, probably because of some the same ignorance that makes me trying to install it... – Miguel May 23 '19 at 09:50
6

As of 18 Nov 2019, this is the method I used to download a file from Google Drive with wget. You need to know whether your file falls in the small or large category. I could not figure out the exact size that separates the two, but I suppose it is somewhere around 100 MB. You can try either method on your file, since one works only for small files and the other only for large ones.

Basic Steps to be followed

Step 1 Make your file shared as "Accessible to anyone having the link". This can be done by right-clicking the file --> Share --> Advanced --> change access to "Public on the web".

Step 2 Save it and click Done.

Step 3 Right-click the file again and click "Get shareable link". This copies the link to the clipboard.

Step 4 Copy everything after ?id= to the end and save it to a notepad file. This is your FILE_ID, which is used below.

Step 5 Follow the steps below, based on file size, after performing the common steps above.

Small Files

Step 1 Use the command:

wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILE_ID' -O FILE_NAME_ALONG_WITH_SUFFIX

FILE_ID should be the value copied in the step above, and FILE_NAME_ALONG_WITH_SUFFIX is the name you want the file saved as on your system/server. Do not forget to include the suffix (.zip, .txt, etc.).

Step 2 Run the command. It may show a "Will not apply HSTS" error, but that's OK; your file will be downloaded.

Large Files

Step 1 Use the command

wget --no-check-certificate --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILE_ID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILE_ID" -O FILE_NAME_ALONG_WITH_SUFFIX && rm -rf /tmp/cookies.txt

Change FILE_ID in 2 locations and FILE_NAME_WITH_SUFFIX once.

Step 2 Execute the command. It may give the same error as mentioned above, but that's OK.

Hope it helps..
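The inner wget ... | sed pipeline in the large-file command only scrapes the confirm token out of the warning page. The sed step (GNU sed is assumed for -r) can be checked in isolation against a stand-in snippet; the token 5Gb9 here is made up:

```shell
# Stand-in for the virus-scan warning page; the confirm token 5Gb9 is a placeholder.
page='<a href="/uc?export=download&confirm=5Gb9&id=FILE_ID">Download anyway</a>'
# -r enables extended regexes, -n + /p prints only successful substitutions;
# the capture group keeps just the token after "confirm=".
echo "${page}" | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p'
```

The printed token is what the outer wget substitutes into its confirm= parameter.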

Vaibhav
2

For small file

Run following command on your terminal:

wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O FILENAME

In the above command, replace FILEID with the ID extracted earlier, and set FILENAME to whatever name suits you.


For large file

Run the following command with necessary changes in FILEID and FILENAME:

wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -rf /tmp/cookies.txt

You can also use this single purpose website to generate this command for you.

2

You can use a tool called gdrive instead of wget/curl. It is basically a tool to access a Google Drive account from the command line. To set it up on a Linux machine:

  1. First download the gdrive executable here.
  2. Decompress it to get the executable file gdrive, then make it executable with chmod +x gdrive.
  3. Run ./gdrive about and you will get a URL and a prompt for a verification code. Copy the link, open it in your browser, log into your Google Drive account, and grant permission. You will get a verification code at the end; copy it.
  4. Go back to your terminal and paste the verification code you just copied to complete the verification.

Now once successfully completed the authentication process above you can navigate the files on your drive using the commands mentioned below.

./gdrive list                                  # list all files in your account
./gdrive list -q "name contains 'University'"  # search files by name
./gdrive download fileID                       # download a file; find the fileID in the 'gdrive list' output
./gdrive upload filename                       # upload a local file to your Google Drive account
./gdrive mkdir foldername                      # create a new folder

The process has to be done only once and it works seamlessly.

Gaurav Shrivastava
0

For wget to work successfully on large files, use the link that asks for confirmation to download.

First get the ID from the Google Drive link; in your case the ID is 1Wb2NfKTQr_dLoFJH0GfM0cx-t4r07IVl.

And then get the file name that you want to download.

Then use this:

wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -rf /tmp/cookies.txt

Replace FILEID (in both places) with the ID you got from the link, and FILENAME with your file name.

For reference, read through this thread: https://gist.github.com/iamtekeste/3cdfd0366ebfd2c0d805 . It contains many useful methods.

SenthurLP
0

Here's the best way to do it with curl, step by step.

Windows 10 has curl by default, but the steps below apply to any machine.

First generate a direct link to your file. A simple tool like this will help; I have listed some of the ones I used here, in case this one goes down someday.

Or just replace the XXX's below to your file ID.

https://drive.google.com/uc?export=download&id=XXXXXXXXXXXXXXXXXXXXXXXX

Note: the Drive file should be available to anyone with the link. Instructions are here.

Now open a new tab and paste the newly generated link; it will automatically download the file.

However, if it shows the infamous "cannot scan for viruses" page, then we cannot use the link with curl yet.

You have to create a Google Drive API key and use the following link structure instead:

https://www.googleapis.com/drive/v3/files/XXXXXXXXX?alt=media&key=YYYYYYYYY

The XXXXX's are your file ID and the YYYYY's are your API key (currently version 3). Alternatively, you can use this generator tool as an aid.

But again, you need the API key first, so head over HERE for instructions on how to get one.

Now that the direct link is ready, the most important thing is to make sure a USER AGENT is set in your curl command, because without one you will receive an error saying you have reached the limits. Here's the final command:

curl -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" -L "https://www.googleapis.com/drive/v3/files/XXXXXXXXXXXXXXXXXXX?alt=media&key=YYYYYYYYYYYYYYYYYYYY" -o filename.zip

The above is set to the latest Chrome-on-Windows user agent, but it should work on any machine; it merely makes the server think the request comes from Chrome on Windows 10. Refer here for the latest one, as using an old UA also throws an error.
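As a sanity check before calling curl, the API URL can be assembled from variables; FILE_ID and API_KEY below are placeholders for your own values:

```shell
# Placeholders; substitute your real file ID and Drive API v3 key.
FILE_ID="XXXXXXXXX"
API_KEY="YYYYYYYYY"
# Build the v3 files endpoint URL with alt=media so the file content is returned.
URL="https://www.googleapis.com/drive/v3/files/${FILE_ID}?alt=media&key=${API_KEY}"
echo "${URL}"
```

Echoing the URL first makes it easy to spot a mangled ID or key before curl reports a cryptic API error.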

If you find it helpful, please let me know.

0

This solution works even for large files, without cookies.

The trick is the &confirm=yes query parameter. Then:

wget "drive.google.com/u/3/uc?id=FILEID&export=download&confirm=yes"
-3
curl gdrive.sh | bash -s 0B4fk8L6brI_eX1U5Ui1Lb1FpVG8

0B4fk8L6brI_eX1U5Ui1Lb1FpVG8 is the file ID.

井上智文