
I am using the RJDBC package to connect R to Oracle, with the ojdbc6.jar driver.

I have a column of CLOB datatype containing HTML-encoded data. When I try to fetch that data in R, it doesn't return a result and hangs my system. Below is my code.

There are 200,000 rows in my table.

queryToRun <- "SELECT * FROM ABData"

output <- data.frame(dbGetQuery(jdbcConnection, queryToRun), stringsAsFactors = FALSE)

How can I do that efficiently?

  • A few notes: There is no need to `data.frame` or `paste` in these simple queries. Also your code is not correct as it. `data <- dbGetQuery(jdbcConnection, 'SELECT * from ABData')` would give your desired result. For efficiency, the question is how much of the clob data is in fact "long strings" and how much is just simpler string. 200k rows is not bad at all, but it depends on the complete size of the table. – Oliver Jul 30 '20 at 12:17
  • 1
    More specifically try checking the memory size of the table: [`dbGetQuery(jdbcConnection, "select bytes/1024/1024 as MB from user_segments where segment_name = 'ABData' ")`](https://stackoverflow.com/a/27013071/10782538). Remember that the table has to be downloaded, and then processed. Depending on how big the table is, how fast your connection is, and how powerful your pc is this might take a bit. – Oliver Jul 30 '20 at 12:19
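Building on the comments, one way to keep a large CLOB-heavy result set from overwhelming memory is to fetch it in chunks with `dbSendQuery()` and `fetch(res, n = ...)` instead of one `dbGetQuery()` call. This is a minimal sketch, assuming hypothetical connection details (driver path, host, credentials) that you would replace with your own:

```r
library(RJDBC)

# Hypothetical connection details -- substitute your own jar path and credentials
drv <- JDBC("oracle.jdbc.OracleDriver", "ojdbc6.jar")
jdbcConnection <- dbConnect(drv,
                            "jdbc:oracle:thin:@//host:1521/service",
                            "user", "password")

# Send the query without materialising the full result set at once
res <- dbSendQuery(jdbcConnection, "SELECT * FROM ABData")

# Fetch in chunks of 10,000 rows so the CLOB column is pulled piecewise
chunks <- list()
repeat {
  chunk <- fetch(res, n = 10000)
  if (nrow(chunk) == 0) break
  chunks[[length(chunks) + 1]] <- chunk
}
output <- do.call(rbind, chunks)

dbClearResult(res)
dbDisconnect(jdbcConnection)
```

If the hang persists even with small chunks, the bottleneck is likely the CLOB size per row rather than the row count, and selecting only the columns you actually need (or `DBMS_LOB.SUBSTR` on the CLOB) may help.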

0 Answers