I imported a large CSV file (about 2 GB), did some preliminary work with it, and closed R. Now I'm trying to boot up RStudio again and the little 'loading' circle goes on and on, then I get a warning that the session is taking a long time to start. I click "terminate" on the current session, but then the screen just goes white and I still can't access anything in RStudio. Help?
My first guess is that RStudio is trying to reload the environment. If you don't mind nuking all the objects in the global environment (warning: you'll have to recreate them!), you could delete the `.RData` file (which may be hidden) and it should open quickly again, if that's the issue – Akhil Nair Apr 16 '17 at 19:38
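A minimal sketch of that cleanup, assuming the saved workspace sits in the working/project directory (the path is an assumption; adjust it if RStudio saved `.RData` elsewhere). Disabling "Restore .RData into workspace at startup" under Tools → Global Options avoids the problem on future startups.

```r
# Sketch: remove the saved workspace so RStudio doesn't try to reload it at startup.
# Assumes .RData is in the current working/project directory (adjust the path if not).
rdata_path <- file.path(getwd(), ".RData")

if (file.exists(rdata_path)) {
  file.remove(rdata_path)  # all saved objects are lost and must be recreated
}
```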
2 GB of data does not sound like it would make the `.RData` file "too big for normal startup" (if that's what is happening); I wonder if you have several mid-processing copies of the data, multiplying its effect on your stored environment. If that's the case, before you exit RStudio next time, try "cleaning up" by removing all large-memory objects you no longer need. (There are a couple of answers to [this question](http://stackoverflow.com/questions/1358003/tricks-to-manage-the-available-memory-in-an-r-session) that list objects by memory.) – r2evans Apr 16 '17 at 20:01
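A hedged sketch of that pre-exit cleanup, along the lines of the answers in the linked question (`big_intermediate_copy` is a hypothetical object name standing in for whatever copies you no longer need):

```r
# Sketch: list objects in the global environment by approximate memory use,
# then drop the big intermediates you no longer need before quitting.
obj_sizes <- sapply(ls(envir = .GlobalEnv),
                    function(x) object.size(get(x, envir = .GlobalEnv)))
sort(obj_sizes, decreasing = TRUE)  # largest objects first, in bytes

rm(big_intermediate_copy)           # hypothetical name of a copy you can discard
gc()                                # return freed memory to the OS where possible
```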
Thanks, you were both right! I got it working now. This is my first time working with a big file in R (actually closer to 5GB now that I look at it...) and I'm eager to get it pared down to something more manageable. – garson Apr 16 '17 at 20:15
Although RStudio might increase memory requirements a bit, it's more likely that this is just an R constraint. – IRTFM Apr 17 '17 at 00:06