I am trying to download files from an internal website using `download.file`. It works fine most of the time, and I know that, on the back end, it is built on SQL. However, if the SQL query takes too long to execute (for example, if I increase the number of years), it throws the error message below.
(If I open the download path directly in a web browser, it shows a blank screen, and after 5 minutes or so the file download starts.)
```
Error in download.file(., destfile = path, :
  cannot open URL ''
In addition: Warning message:
In download.file(., destfile = path, :
  InternetOpenUrl failed: 'The operation timed out'
```
I want to know whether there is a way to increase the timeout limit, so that `download.file` can wait until the query finishes.
(I cannot access the database or change the query directly.)
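One detail that may matter here: the `InternetOpenUrl` error comes from the Windows "wininet" method, and `options(timeout = )` is only honored by the "internal" and "libcurl" methods, not by "wininet". A minimal sketch of switching methods (the URL and file name below are placeholders, not from the original question):

```r
# Hypothetical internal URL; substitute the real download path
url  <- "http://intranet.example.com/report?years=10"
path <- "report.csv"

# options(timeout = ) is respected by the "libcurl" (and "internal")
# methods, but NOT by the default "wininet" method on Windows,
# which is where the InternetOpenUrl error originates.
options(timeout = 600)  # seconds

download.file(url, destfile = path, method = "libcurl", mode = "wb")
```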
Attempts (none of these worked):

```r
options(timeout = 100000000)
options(windowsTimeouts = 100000000)
cacheOK = FALSE
RCurl::curlSetOpt(timeout = 100000000)
```
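Another route I have seen suggested is bypassing `download.file` entirely and using the `curl` package, where the libcurl timeout options can be set on a handle per download. A sketch, assuming a placeholder URL:

```r
library(curl)

# Hypothetical internal URL; substitute the real download path
url  <- "http://intranet.example.com/report?years=10"
path <- "report.csv"

# libcurl options set by their lowercase names:
#   connecttimeout - seconds allowed to establish the connection
#   timeout        - seconds allowed for the entire transfer
h <- new_handle(connecttimeout = 60, timeout = 600)

curl_download(url, destfile = path, mode = "wb", handle = h)
```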
Updates:
- 04/27/2020: I found the question "What's the 'internal method' of R's download.file?", which discusses the C code behind `download.file` (see internet.c and nanohttp.c for more details). Is there a way to increase the timeout by editing these files?
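Editing internet.c or nanohttp.c would mean rebuilding R from source. A lighter alternative, if switching away from `download.file` is acceptable, is `httr`, which exposes the timeout per request; this is only a sketch with a placeholder URL:

```r
library(httr)

# Hypothetical internal URL; substitute the real download path
url  <- "http://intranet.example.com/report?years=10"
path <- "report.csv"

# timeout() sets the request timeout in seconds for this one call;
# write_disk() streams the response body straight to a file
resp <- GET(url, timeout(600), write_disk(path, overwrite = TRUE))
stop_for_status(resp)  # error out on HTTP 4xx/5xx responses
```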