
I am looking for a tryCatch function in R that would retry n times instead of just once. One of my web requests occasionally fails to return a value when the server is busy, but after one or two retries it usually works fine.

The excellent page How to write trycatch in R does not touch on this topic. I found the function TryRetry in C (originally discussed in TryRetry - Try, Catch, then Retry), which accomplishes what I was looking for, and I thought maybe a similar function exists in some R package too?

Unfortunately, I don't have the skills to abstract an R code structure from the C example. I could just call my function again in the error-handling portion of the tryCatch (as sketched below), but somehow this seems like the wrong way to go, especially once you deal with more than one retry.
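
For illustration, the nested version I am trying to avoid looks roughly like this (a rough sketch; `my_request()` stands in for my actual web request):

result <- tryCatch(
  my_request(),
  error = function(e) {
    tryCatch(
      my_request(),            # first retry
      error = function(e2) {
        my_request()           # second retry; an error here is unhandled
      }
    )
  }
)

Every additional retry adds another level of nesting, which is why this does not scale.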

Any suggestions on how to approach a tryRetry-code structure in R would be appreciated.

sdittmar
  • If you're using the `httr` package to make the request, you can wrap it in a `RETRY` call, passing a `times` argument to specify the max number of retries. See the `httr` docs [here](https://www.rdocumentation.org/packages/httr/versions/1.4.1/topics/RETRY) – anddt Feb 05 '20 at 10:33

2 Answers

You can implement retry logic by relying on the RETRY function from the httr package and then parsing the response in a second step. To apply it to a file download, I would go down the following path (using this hosted .csv file as an example):

library(httr)
library(dplyr)  # provides the %>% pipe

df <- RETRY(
  "GET",
  url = "https://www.stats.govt.nz/assets/Uploads/Business-operations-survey/Business-operations-survey-2018/Download-data/business-operations-survey-2018-business-finance-csv.csv",
  times = 3              # max retry attempts
) %>%
  content("parsed")      # parse the response body (here: csv -> data frame)
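
If the automatic parsing does not fit your format, a two-step variant might look like this (a sketch, assuming the body is CSV text; `my_url` is a placeholder for your own URL):

resp <- RETRY("GET", url = my_url, times = 3)
raw <- content(resp, as = "text", encoding = "UTF-8")  # body as plain text
df <- read.csv(text = raw)                             # parse it yourself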
anddt
  • Thank you Andrea. I followed your suggestion and used `content(RETRY("GET", url = myLink), type = "text/csv")` which works for my case. `content()` is quite clever about selecting the right method to extract data from the `response` object that `RETRY` returns. – sdittmar Feb 05 '20 at 11:14
  • Indeed `content` covers basically 90% of response formats. When your response needs custom parsing you can always return it as text and parse it in a second step. – anddt Feb 05 '20 at 11:20

Here is a way of having a web read request tried several times before failing. It's an adaptation of the post linked to in the question, called in a loop up to a number of times chosen by the user. Between tries there is a Sys.sleep defaulting to 3 seconds.

I repost the function readUrl with changes, and with many comments deleted; they are in the original code.

readUrl <- function(url) {
  out <- tryCatch(
    {
      message("This is the 'try' part")

      text <- readLines(con=url, warn=FALSE) 
      return(list(ok = TRUE, contents = text))
    },
    error=function(cond) {
      message(paste("URL does not seem to exist:", url))
      message("Here's the original error message:")
      message(paste(cond, "\n"))
      # Choose a return value in case of error
      return(list(ok = FALSE, contents = cond))
    },
    warning=function(cond) {
      message(paste("URL caused a warning:", url))
      message("Here's the original warning message:")
      message(paste(cond, "\n"))
      # Choose a return value in case of warning
      return(list(ok = FALSE, contents = cond))
    },
    finally={
      message(paste("Processed URL:", url))
      message("Some other message at the end")
    }
  )    
  return(out)
}

readUrlRetry <- function(url, times = 1, secs = 3){
  count <- 0L
  while (count < times) {
    res <- readUrl(url)
    count <- count + 1L
    if (res$ok) break
    if (count < times) Sys.sleep(time = secs)  # pause only if another try follows
  }
  res
}
url <- c(
  "http://stat.ethz.ch/R-manual/R-devel/library/base/html/connections.html",
  "http://en.wikipedia.org/wiki/Xz",
  "xxxxx")  # an invalid URL, to exercise the retry logic

res <- lapply(url, readUrlRetry, times = 3)
res[[3]]                                # result of the request that failed
inherits(res[[3]]$contents, "warning")  # was the captured condition a warning?
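
The same loop structure generalizes beyond readUrl. A minimal sketch of a generic helper (the name tryRetry and its arguments are hypothetical; passing a function rather than a bare expression guarantees the request is re-run on every attempt):

tryRetry <- function(f, times = 3, secs = 3) {
  for (i in seq_len(times)) {
    res <- tryCatch(f(), error = function(e) e)
    if (!inherits(res, "error")) return(res)  # success: return the value
    if (i < times) Sys.sleep(secs)            # pause before the next attempt
  }
  stop(res)  # all attempts failed: re-signal the last error
}

text <- tryRetry(function() readLines(url[1], warn = FALSE))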
Rui Barradas