3

I need to download files from an SFTP server, parse them, and insert the contents into the database.

I am currently using RCurl as follows:

library(RCurl)
url <- "sftp://data.ftp.net/incoming.data.txt"
x <- getURL(url, userpwd = "<id>:<passwd>")
writeLines(x, "incoming.data.txt")

I also looked at download.file and I don't see SFTP support in download.file. Has anybody else done similar work like this? Since I will be getting multiple files, I noticed that RCurl sometimes times out. I'd like to download all the files from the SFTP server first and then process them. Any ideas?

user1471980
  • 10,127
  • 48
  • 136
  • 235

1 Answer

2

It sounds like the question is "how do I avoid timeouts in RCurl?"

Increase the value of CURLOPT_CONNECTTIMEOUT (exposed in RCurl as the connecttimeout option). This is really just the same problem as Setting Curl's Timeout in PHP.

Edit, from comments below:

x <- getURL(url, userpwd = "<id>:<passwd>", connecttimeout = 60)  # e.g., 60 seconds
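Since you mention wanting to download all the files first and then process them, the same option works inside a loop. A minimal sketch, assuming the server, credentials, and file names are placeholders you'd replace with your own:

```r
library(RCurl)

# Hypothetical file names on the server -- substitute the real ones.
files <- c("incoming.data.txt", "incoming2.data.txt")
base  <- "sftp://data.ftp.net/"

for (f in files) {
  x <- getURL(paste0(base, f),
              userpwd = "<id>:<passwd>",
              connecttimeout = 60)  # allow up to 60 s to establish the connection
  writeLines(x, f)                  # save locally; parse and insert to the DB afterwards
}
```

Downloading everything to disk first also means a single timeout only costs you a retry of one file, not the whole batch.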
Community
  • 1
  • 1
Camille Goudeseune
  • 2,934
  • 2
  • 35
  • 56