
I'm running some scripts from R that get information from some websites. The problem is that even though I clean the session with gc(), the memory keeps growing until my session crashes.

Here is the script:

    library(XML)
    library(RJDBC)
    library(RCurl)

    procesarPublicaciones <- function(tabla){

        log_file <<- file(log_path, open="a")

        drv <<- JDBC("oracle.jdbc.OracleDriver", classPath="C:/jdbc/jre6/ojdbc6.jar"," ")
        con <<- dbConnect(drv, "server_path", "user", "password")

        query <- paste("SELECT * FROM",tabla,sep=' ')

        bool <- tryCatch( 
                { 
                    ## Get a list of URLs from a DB
                    listUrl <- dbGetQuery(con, query)
                    dbDisconnect(con)
                    nrow(listUrl) != 0   # TRUE if the query returned any URLs
                },  error = function(e) FALSE
                )
        if( bool ) {

            file.create(data_file)
            apply(listUrl,c(1),procesarHtml)
        }else{
            cat("\n",getTime(),"\t[ERROR]\t\t", file=log_file)
        }
        cat( "\n",getTime(),"\t[INFO]\t\t FINISH", file=log_file)
        close(log_file)
    }

    procesarHtml <- function(pUrl){

        headerGatherer <- basicHeaderGatherer()
        html <- getURI(pUrl, headerfunction = headerGatherer$update, curl = curlHandle)
        headerValue <- headerGatherer$value()

        if ( headerValue["status"] == "200" ){

            doc <- htmlParse(html)
            tryCatch(
                {
                    ## Here I get all the info that I need from the web and write it on a file.
                    ## here is a simplification
                    info1 <- xpathSApply(doc, xPath.info1, xmlValue)
                    info2 <- xpathSApply(doc, xPath.info2, xmlValue)
                    data <- data.frame(col1 = info1, col2=info2)
                    write.table(data, file=data_file , sep=";", row.names=FALSE, col.names=FALSE, append=TRUE)
                }, error= function(e)
                {
                    ## LOG ERROR
                }
            )
            rm(info1, info2, data, doc)
        }else{
            ## LOG INFO
        }
        rm(headerGatherer, html, headerValue)
        cat("\n",getTime(),"\t[INFO]\t\t memory used: ", memory.size()," MB", file=log_file)
        gc()
        cat("\n",getTime(),"\t[INFO]\t\t memory used after gc(): ", memory.size()," MB", file=log_file)
    }

Even though I remove all internal variables with rm() and use gc(), memory keeps growing. It seems that all the HTML that I get from the web is kept in memory.

Here is my Session Info:

> sessionInfo()
R version 3.2.0 (2015-04-16)
Platform: i386-w64-mingw32/i386 (32-bit)
Running under: Windows XP (build 2600) Service Pack 3

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United States.1252   
[3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C                          
[5] LC_TIME=English_United States.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] RCurl_1.95-4.6 bitops_1.0-6   RJDBC_0.2-5    rJava_0.9-6    DBI_0.3.1     
[6] XML_3.98-1.1  

loaded via a namespace (and not attached):
[1] tools_3.2.0

--------------------EDIT 2015-06-08 --------------------

I'm still having the problem, but I found the same issue in another post, which is apparently resolved:

Serious Memory Leak When Iteratively Parsing XML Files

Santiago P

2 Answers


When using the XML package, you'll want to use free() to release the memory allocated by htmlParse() (or any of the other html parsing functions that allocate memory at the C level). I usually place a call to free(doc) as soon as I don't need the html doc any more.

So in your case, I would try placing free(doc) on its own line prior to rm(info1, info2, data, doc) in your function, like this:

    free(doc)
    rm(info1, info2, data, doc)

In fact, the call to free() may be sufficient on its own, so you could remove the rm() call completely.
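
For illustration, here is a minimal, trimmed-down sketch of how that cleanup could look inside procesarHtml. The on.exit() call is my addition, not part of the original code; it guarantees the parsed document is freed even if the extraction throws an error, and the sketch reuses the xPath.info1, xPath.info2 and data_file globals from the question:

    procesarHtml <- function(pUrl){
        html <- getURI(pUrl)
        doc  <- htmlParse(html)
        ## free() releases the memory the parser allocated at the C level;
        ## on.exit() makes sure it runs even when an error is thrown below
        on.exit(free(doc), add = TRUE)

        info1 <- xpathSApply(doc, xPath.info1, xmlValue)
        info2 <- xpathSApply(doc, xPath.info2, xmlValue)
        write.table(data.frame(col1 = info1, col2 = info2),
                    file = data_file, sep = ";",
                    row.names = FALSE, col.names = FALSE, append = TRUE)
    }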

Rich Scriven
  • I added `free()` before `rm()` but it doesn't work. Memory keeps growing, though more slowly than before. Any other suggestions? – Santiago P May 26 '15 at 16:14

I had a related issue using htmlParse(). It led to Windows crashing (out of memory) before my 10,000 iterations completed.

Answer: in addition to free()/rm(), run a garbage collection with gc() (as suggested in Serious Memory Leak When Iteratively Parsing XML Files) every n iterations.
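
For illustration, a minimal sketch of that pattern (the urls vector and the choice of every 100 iterations are assumptions for the example, not something from the original post):

    library(XML)
    library(RCurl)

    for (i in seq_along(urls)) {
        doc <- htmlParse(getURI(urls[i]))
        ## ... extract and write out the data here ...
        free(doc)                  # release the C-level document
        rm(doc)
        if (i %% 100 == 0) gc()    # collect garbage every 100 iterations
    }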

mikecro