For context, I have been using RStudio to pull data from our reporting platform (Domo), manipulating the tables in R, and sending the data frame back into Domo at the end. The start of the code worked perfectly fine for a few months but is now returning an "Error in curl" message. Here is the exact code/error:

library(DomoR)
library(dplyr)
DomoR::init('Company-Name', 'TOKEN-NUMBER')
Total_Vendor_Revenue <- DomoR::fetch('DATA-FILE-NUMBER')

Error in curl::curl_fetch_memory(url, handle = handle) : 
Failed writing body (0 != 16384) 

Do I need to clear some memory on my computer? Is there a way to clear it after the code finishes running? Any information helps, thanks!

  • Heyy, just wondering if you got the answer by any chance? I'm facing a similar problem now..... :( – Mr369 Feb 14 '19 at 20:23

1 Answer


I think this means the table you are trying to fetch is too big. You could filter the table down in size in a Domo ETL and then pull the ETL's output dataset into R, as sketched below.
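
A minimal sketch of that approach, assuming you have built a Domo ETL (e.g. Magic ETL) that writes a pre-filtered output dataset; 'FILTERED-DATA-FILE-NUMBER' is a placeholder for that output's dataset ID, not a real ID. The rm()/gc() calls at the end address the memory question by releasing the large object once the run is complete:

library(DomoR)
library(dplyr)

DomoR::init('Company-Name', 'TOKEN-NUMBER')

# Fetch the smaller, pre-filtered dataset produced by the ETL
# ('FILTERED-DATA-FILE-NUMBER' is a placeholder for its dataset ID)
Total_Vendor_Revenue <- DomoR::fetch('FILTERED-DATA-FILE-NUMBER')

# ... manipulate the data frame and push the result back to Domo ...

# Free memory after the code has completely run
rm(Total_Vendor_Revenue)
gc()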