
I have a very large dataset and am trying to fit a multinomial logistic regression to it. When I call the `multinom` function I get the error:

                     **Error: cannot allocate vector of size 3.3 GB**

I have not been able to resolve this error since yesterday. I am using an `ffdf` (from the `ff` package) to read the data into RStudio.

I have tried `memory.limit()` and `memory.size()`, but they did not resolve the issue.

My RAM is 8 GB and I am using a 64-bit Windows laptop.
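`nnet::multinom()` builds a dense model matrix for the whole dataset in RAM, which is likely what triggers the 3.3 GB allocation. One workaround is to fit on a random subsample of rows and keep only that sample in memory. A minimal sketch, assuming the data sit in an ffdf named `big_ffdf` with an outcome `y` and predictors `x1` and `x2` (these names are placeholders for your own columns):

```r
library(ff)      # out-of-memory data structures
library(ffbase)  # helpers for ffdf objects
library(nnet)    # multinom()

# big_ffdf, y, x1, x2 are placeholder names for your data
n <- nrow(big_ffdf)
idx <- sample(n, size = min(n, 100000))      # cap the fit at 100k rows; tune to taste
sample_df <- as.data.frame(big_ffdf[idx, ]) # pull only the sample into RAM

fit <- multinom(y ~ x1 + x2, data = sample_df, MaxNWts = 2000)
summary(fit)
```

`MaxNWts` may need raising when the outcome has many levels or there are many predictors, since `multinom` refuses to fit networks above its default weight limit.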

    > sessionInfo()
    R version 3.3.1 (2016-06-21)
    Platform: x86_64-w64-mingw32/x64 (64-bit)
    Running under: Windows Server >= 2012 x64 (build 9200)

    locale:
    [1] LC_COLLATE=English_United Kingdom.1252  LC_CTYPE=English_United Kingdom.1252
    [3] LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C
    [5] LC_TIME=English_United Kingdom.1252

    attached base packages:
    [1] stats     graphics  grDevices utils     datasets  methods   base

    other attached packages:
    [1] nnet_7.3-12   ffbase_0.12.3 ff_2.2-13     bit_1.1-12

    loaded via a namespace (and not attached):
    [1] tools_3.3.1     fastmatch_1.0-4

Any help will be appreciated.

Thanks & regards, Pragati Jain

  • Can anyone please provide me with links to the solution to this problem, as it is marked as a duplicate question? – Pragati Jain Jan 04 '17 at 12:25
  • I have tried the solutions given in other links with the same question, but they haven't resolved it. Can anyone please help with this? – Pragati Jain Jan 04 '17 at 12:41
  • I had the same problem with R's mlogit function. Using 200K rows with 10 predictor variables and a 4-category dependent variable, I got the same error message. I restarted RStudio, tried again, and the function ran fine. I assume it has to do with memory, especially if you have stored a bunch of things in your environment (variables, models, data frames, etc.). – Dimitar Nentchev Apr 12 '19 at 15:27
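The restart-and-retry fix in the last comment works because it clears leftover objects from the workspace. You can get the same effect without restarting by removing objects you no longer need and forcing garbage collection; a minimal sketch (the kept object name is a placeholder):

```r
# See which top-level objects are using the most memory, largest first
nms <- ls()
sizes <- sapply(nms, function(nm) object.size(get(nm)))
sort(sizes, decreasing = TRUE)

# Drop everything except what the model fit still needs (placeholder name)
rm(list = setdiff(ls(), "big_ffdf"))
gc()   # trigger garbage collection and report memory usage
```

Freeing large intermediate objects before calling `multinom()` leaves more contiguous memory available for the model matrix it allocates.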
