
I'm wondering if anyone knows of a way to import data from a "big" xlsx file (~20 MB). I tried the xlsx and XLConnect packages. Unfortunately, both use rJava, and I always obtain the same error:

> library(XLConnect)
> wb <- loadWorkbook("MyBigFile.xlsx")
Error: OutOfMemoryError (Java): Java heap space

or

> library(xlsx)
> mydata <- read.xlsx2(file="MyBigFile.xlsx")
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  : 
   java.lang.OutOfMemoryError: Java heap space

I also tried to modify the java.parameters before loading rJava:

> options( java.parameters = "-Xmx2500m")
> library(xlsx) # load rJava
> mydata <- read.xlsx2(file="MyBigFile.xlsx")
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  : 
   java.lang.OutOfMemoryError: Java heap space

or after loading rJava (this is a bit stupid, I think):

> library(xlsx) # load rJava
> options( java.parameters = "-Xmx2500m")
> mydata <- read.xlsx2(file="MyBigFile.xlsx")
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  : 
   java.lang.OutOfMemoryError: Java heap space

But nothing works. Does anyone have an idea?

user2722443
  • Have you considered saving your data into a more universal format, e.g. CSV? – flodel Oct 02 '13 at 23:12
  • `gdata` is another option. I believe it is not Java based, but I could be mistaken. – Ricardo Saporta Oct 03 '13 at 00:22
  • That's right, `gdata` uses Perl. – Ben Oct 03 '13 at 02:36
  • Why is it that big? Lots of rows (do you need them all?), lots of columns (do you need them all?), lots of individual sheets (do you need them all?), one high-resolution embedded image (you don't need that...)? For spreadsheets and other binary files, the size of the file in bytes is often not a useful measure of how big the data in it really is. – Spacedman Oct 03 '13 at 06:56
  • `gdata` works... very slowly, about 7 minutes per sheet, but it works. – user2722443 Oct 03 '13 at 21:22
  • @flodel: you are right about CSV; usually I do that. Unfortunately, in my case I have no choice because my inputs are several xlsx files with 5 sheets each (10,000 rows x 80 columns). I could manually open each with Excel and export to CSV (or write some VBA code to do that), but I'd rather do it entirely in R. – user2722443 Oct 03 '13 at 21:31
  • @Spacedman: My xlsx file only contains "raw data" (numeric and some factors). – user2722443 Oct 03 '13 at 21:35
  • I've been working on importing a colleague's monstrous, formula-laden Excel file (150 MB), and `gdata` was the only Excel package that could pull it off. As here, Java-based packages ran out of memory; `openxlsx` segfaulted. `gdata` took 30 minutes per sheet, but it got the job done. – Matt Parker Aug 29 '14 at 15:24
  • +1 for `gdata`; I had to load 12 mid-sized Excel tables and xlsx took a horrendous amount of time. `gdata` made it a breeze. – Oeufcoque Penteano Nov 27 '15 at 00:24
  • `gdata` requires Perl. Does anyone know what that is? – HNSKD Oct 31 '16 at 07:54
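
Since several of the comments above settle on `gdata`, here is a minimal sketch of that route (assuming Perl is installed; `read.xls` converts the file via Perl behind the scenes, which is slow but memory-friendly, and the file name is just an example):

library(gdata)
# point the perl argument at your Perl binary if it is not on the PATH
mydata <- read.xls("MyBigFile.xlsx", sheet = 1, perl = "perl")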

8 Answers


I stumbled on this question when someone sent me (yet another) Excel file to analyze. This one isn't even that big but for whatever reason I was running into a similar error:

java.lang.OutOfMemoryError: GC overhead limit exceeded

Based on a comment by @DirkEddelbuettel on another answer, I installed the openxlsx package (http://cran.r-project.org/web/packages/openxlsx/) and then ran:

library("openxlsx")
mydf <- read.xlsx("BigExcelFile.xlsx", sheet = 1, startRow = 2, colNames = TRUE)

It was just what I was looking for. Easy to use and wicked fast. It's my new BFF. Thanks for the tip @DirkEddelbuettel!

orville jackson
  • I tried so many methods to read a big .xlsx file, but nothing seemed to work for me. I was getting an error when I was using Schaun Wheeler's function on GitHub, and could not figure out how to use the Perl command in gdata on my computer. `openxlsx` is such a life saver for me. Thanks @Dirk Eddelbuettel and @orville jackson. – nasia jaffri Oct 27 '14 at 15:31
  • Do you know of another solution? I can't find a way to open .xls files with openxlsx. – user124123 Jan 07 '15 at 14:44
  • You could try the read.xls function in the gdata package. Never used it myself, but worth a shot. – orville jackson Jan 08 '15 at 21:33
  • openxlsx is the only library that worked for my Excel file (70 MB), but I first had to convert it from .xls to .xlsx. – agenis Mar 04 '16 at 15:35
  • openxlsx has the disadvantage that it does not recognize dates. To me, read_excel from the readxl package seems like the way to go. – peer Aug 28 '18 at 18:32
  • If openxlsx also leads to the same error, increase the RAM size, if you are working on data lakes with the option to change the configuration. – Abhishek Oct 10 '18 at 09:33

options(java.parameters = "-Xmx2048m")  ## memory set to 2 GB
library(XLConnect)

Allow for more memory using `options` before any Java component is loaded. Then load the XLConnect package (it uses Java).

That's it. Start reading in data with `readWorksheet`, and so on. :)
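
A minimal sketch of the full sequence (the file name and sheet index here are placeholders, not from the original answer):

options(java.parameters = "-Xmx2048m")  ## must run before rJava/XLConnect is loaded
library(XLConnect)
wb <- loadWorkbook("MyBigFile.xlsx")
mydata <- readWorksheet(wb, sheet = 1)  ## sheet index or name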

viquanto
  • Thanks for the tip. Important to note: I had to issue the `options(java.parameters = "-Xmx2048m")` before issuing `require('rJava')` when using this within RStudio. Unfortunately, I'm getting a new error now: "java.lang.OutOfMemoryError: GC overhead limit exceeded", but that's a different problem, I'm sure. – pbnelson May 21 '17 at 15:05
  • This worked for me, but I also had to make sure my R version matched my Java version (e.g. both 64-bit), and set the Java path correctly: `options(java.parameters="-Xmx4g") # increase java memory`, `Sys.setenv(JAVA_HOME='C:\\Program Files\\Java\\jdk-11.0.2') # for 64-bit version`, `library(rJava) # check it works` – Simon Woodward Mar 11 '19 at 02:22

I agree with @orville jackson's response, and it really helped me too.

In line with the answer provided by @orville jackson, here is a detailed description of how you can use openxlsx for reading and writing big files.

When the data size is small, R has many packages and functions that can be used as per your requirement.

write.xlsx, write.xlsx2, and XLConnect also do the work, but they are sometimes slow compared to openxlsx.

So, if you are dealing with large data sets and run into Java errors, I suggest taking a look at openxlsx, which is really awesome and cuts the time to about 1/12th.

I've tested them all, and in the end I was really impressed with the performance of openxlsx.

Here are the steps for writing multiple data sets into multiple sheets.

install.packages("openxlsx")
library("openxlsx")

start.time <- Sys.time()

# Creating large data frames (200,000 rows x 20 columns each)
x <- as.data.frame(matrix(1:4000000, 200000, 20))
y <- as.data.frame(matrix(1:4000000, 200000, 20))
z <- as.data.frame(matrix(1:4000000, 200000, 20))

# Creating a workbook (the argument is the creator name; the output
# file name is supplied later, in saveWorkbook)
wb <- createWorkbook("Example.xlsx")
Sys.setenv("R_ZIPCMD" = "C:/Rtools/bin/zip.exe") ## path to zip.exe

The `Sys.setenv("R_ZIPCMD" = ...)` path has to point at the zip utility shipped with Rtools, since openxlsx relies on it to compress the xlsx file.

Note: in case Rtools is not installed on your system, please install it first for a smooth experience. Here is the link for your reference (choose the appropriate version): https://cran.r-project.org/bin/windows/Rtools/

Check the options as per the link below (you need to select all the check boxes during installation): https://cloud.githubusercontent.com/assets/7400673/12230758/99fb2202-b8a6-11e5-82e6-836159440831.png

# Adding worksheets: the parameters for addWorksheet are 1. workbook object 2. sheet name

addWorksheet(wb, "Sheet 1")
addWorksheet(wb, "Sheet 2")
addWorksheet(wb, "Sheet 3")

# Writing data into the respective sheets: the parameters for writeData are 1. workbook object 2. sheet index/sheet name 3. data frame name

writeData(wb, 1, x)

# In case you would like the sheet to include a filter for ease of access, you can pass withFilter = TRUE to writeData.
writeData(wb, 2, x = y, withFilter = TRUE)

## Similarly, writeDataTable is another way to write your data, with table formatting:

writeDataTable(wb, 3, z)

saveWorkbook(wb, file = "Example.xlsx", overwrite = TRUE)

end.time <- Sys.time()
time.taken <- end.time - start.time
time.taken

The openxlsx package is really good for reading and writing huge data from/to Excel files, and it has lots of options for custom formatting within Excel.

The interesting fact is that we don't have to bother about Java heap memory here.
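
As a small illustration of those formatting options, here is a sketch building on the workbook above (the style values are arbitrary examples; see ?createStyle for the full argument list):

hdr <- createStyle(textDecoration = "bold", fgFill = "#DCE6F1")  # bold header row with a light fill
addStyle(wb, sheet = 1, style = hdr, rows = 1, cols = 1:ncol(x), gridExpand = TRUE)
saveWorkbook(wb, file = "Example.xlsx", overwrite = TRUE)  # re-save to apply the style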

ayush varshney
  • I tested read.xlsx2, XLConnect, readxl, and openxlsx, and openxlsx is multiple times faster than the others. – Ali Jul 03 '17 at 06:53

I know this question is a bit old, but there is a good solution for it nowadays: the readxl package. It is the default used when you import an Excel file via the RStudio GUI, and it works well in my situation.

library(readxl)

data <- read_excel(filename)  # filename: path to your xlsx file
Doongsil

As mentioned in the canonical Excel->R question, a recent alternative comes from the readxl package, which I've found to be quite fast compared with, e.g., openxlsx and xlsx.

That said, there's a definite limit of spreadsheet size past which you're probably better off just saving the thing as a .csv and using `fread`, as sketched below.
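
A quick sketch of that route with `data.table::fread` (the CSV name is just an example):

library(data.table)
# after exporting the sheet to CSV (from Excel, or via a converter)
mydata <- fread("MyBigFile.csv")  # fast, multi-threaded CSV reader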

MichaelChirico

I also had the same error with both xlsx::read.xlsx and XLConnect::readWorksheetFromFile. Maybe you can use RODBC::odbcDriverConnect and RODBC::sqlFetch instead, which go through Microsoft's ODBC interface and can be much more efficient.
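
A rough sketch of that approach (Windows only; the driver string, file path, and sheet name are assumptions, and the ODBC driver's bitness must match your R session; `odbcDataSources()` lists the drivers installed on your machine):

library(RODBC)
con <- odbcDriverConnect(
  "Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=C:/path/MyBigFile.xlsx")
mydata <- sqlQuery(con, "SELECT * FROM [Sheet1$]")  # a sheet is queried as [SheetName$]
odbcClose(con)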

Jingnan Li

@flodel's suggestion of converting to CSV seems the most straightforward. If, for whatever reason, that's not an option, you can read the file in chunks:

 require(XLConnect)
 chnksz <- 2e3
 s <- <sheet>                  # placeholder: sheet name or index
 wb <- loadWorkbook(<file>)    # placeholder: path to the xlsx file
 tot.rows <- getLastRow(wb, s)
 for (i in seq(ceiling(tot.rows / chnksz))) {
    start.row <- (i - 1) * chnksz + 1
    end.row   <- min(i * chnksz, tot.rows)
    # header = FALSE so each chunk is read as data; handle column names yourself
    next.batch <- readWorksheet(wb, s, startRow = start.row, endRow = end.row, header = FALSE)
    # optionally save next.batch to disk or
    # append it to a list. See which works for you.
 }
Ricardo Saporta
  • Unfortunately, the `loadWorkbook` command generates an "OutOfMemoryError". With the same idea, I tried `mydata.chunk = read.xlsx2(file="MyBigFile.xlsx", sheetIndex=1, startRow=1, endRow=10)`, but it's still the same error. – user2722443 Oct 03 '13 at 21:43
  • @user2722443, are you saving the portions you've read in, then removing them from memory? Also try running `gc()` in each for loop. It will slow you down but clear out some memory. Incidentally, are you sure that converting to CSV is out of the question? – Ricardo Saporta Oct 03 '13 at 21:43
  • @Ricardo Saporta: in fact, the `mydata.chunk = read.xlsx2(file="MyBigFile.xlsx", sheetIndex=1, startRow=1, endRow=10)` call generates an "OutOfMemoryError" by itself, so I can't remove anything. Concerning the CSV conversion, it's not totally out of the question, but it's an external operation (before loading into R). – user2722443 Oct 10 '13 at 20:04

I found this thread while looking for an answer to the exact same question. Rather than trying to hack the xlsx file from within R, what ended up working for me was to convert the file to .csv using Python and then import it into R using a standard scanning function.

Check out: https://github.com/dilshod/xlsx2csv
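
A sketch of that workflow driven from R (assuming the xlsx2csv tool is installed and on the PATH; check `xlsx2csv --help` for the exact flags in your version):

system("xlsx2csv -s 1 MyBigFile.xlsx MyBigFile.csv")  # -s 1: export the first sheet
mydata <- read.csv("MyBigFile.csv")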

aaron
  • ... which is what has been available for a decade in the gdata package for R (but using Perl behind the scenes). – Dirk Eddelbuettel Jun 19 '14 at 00:56
  • When I worked on the problem using gdata, it was unacceptably slow. This Python script converts large xlsx files extremely quickly. – aaron Jun 19 '14 at 01:16
  • How is this answer different from @flodel's suggestion mentioned in another answer? IMHO RODBC has few advantages over an intermediate CSV format. – mlt Jun 19 '14 at 01:28
  • There is also a new kid on the block: [openxlsx](http://cran.r-project.org/web/packages/openxlsx/index.html), which uses just Rcpp and nothing but C++ code, and claims to be very fast. Not sure how refined it is. – Dirk Eddelbuettel Jun 19 '14 at 01:29
  • Why wouldn't you just open it in Excel and export to CSV? – MattE Jul 02 '17 at 00:21