
Is anyone aware of an R-based data analysis setup that works well in a research data centre with no internet access? I would like to follow good reproducible-analysis practices, but I do not have permission to upload files to a repository, for example. Also, RStudio preferences (such as the path to a local package repository) are not saved. So far I know:

  1. The miniCRAN package helps with gathering all R package dependencies, so a local package repository can be carried into the data centre (a sketch of the workflow I have in mind follows this list).

  2. Local `git` version control does not require internet access (see the second sketch below). Plus, the data centre technician may be willing to use it to release results and R scripts if the learning curve is outweighed by long-run time savings.
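
For item 1, this is a minimal sketch of the miniCRAN workflow I have in mind. The package names and paths are only placeholders, and the repository folder would have to be carried into the data centre by whatever transfer route is permitted:

```r
# On a machine with internet access: build a local CRAN-like repository
# holding the needed packages plus all of their dependencies.
# Package names and paths below are placeholders.
library(miniCRAN)

pkgs <- c("dplyr", "ggplot2")                 # packages the analysis needs
cran <- c(CRAN = "https://cloud.r-project.org")

pkg_list <- pkgDep(pkgs, repos = cran, type = "source", suggests = FALSE)

repo_dir <- "miniCRAN-repo"                   # folder to carry into the data centre
dir.create(repo_dir, showWarnings = FALSE)
makeRepo(pkg_list, path = repo_dir, repos = cran, type = "source")

# Inside the data centre: install from the copied folder, no internet needed.
# Putting the options() line in a project .Rprofile keeps the repository path
# even though RStudio preferences are not retained.
options(repos = c(LOCAL = "file:///path/to/miniCRAN-repo"))
install.packages(pkgs, type = "source")
```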

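For item 2, this is the kind of purely local workflow I mean, sketched with the `git2r` package so everything stays inside R (plain command-line `git` would work just as well; the paths, user details, and branch name are hypothetical):

```r
# A purely local git workflow via the git2r package; no network access needed.
library(git2r)

# A bare repository on a shared drive can serve as the "central" copy.
init("//shared/drive/analysis.git", bare = TRUE)

# Working repository in my project folder.
repo <- init("analysis")
config(repo, user.name = "Analyst", user.email = "analyst@example.org")

# Add an analysis script and commit it.
writeLines("library(dplyr)  # analysis code goes here",
           file.path("analysis", "01-clean.R"))
add(repo, "01-clean.R")
commit(repo, message = "Add data-cleaning script")

# Point the working repository at the shared bare repo and push to it,
# assuming the default branch is called "master".
remote_add(repo, name = "origin", url = "//shared/drive/analysis.git")
push(repo, "origin", "refs/heads/master")
```
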
I am considering writing up a proposal for a pilot project, but I don't want to reinvent the wheel, especially if a more comprehensive platform already exists. I have looked but haven't found one, unless Microsoft R Open essentially does this (it isn't clear to me whether it does).

Thanks for your consideration!

  • You can push to a local Git repository, and then take backups of your `.git` folder locally as a precaution. – Tim Biegeleisen Jun 13 '17 at 01:20
  • RStudio can be made [portable](https://support.rstudio.com/hc/en-us/articles/200534467-Creating-a-Portable-Version-of-RStudio-for-a-USB-Drive) – hrbrmstr Jun 13 '17 at 01:22
  • Having said that, the answer is likely bigger than a breadbox. Meaning, without knowing the actual restrictions (which you really should not post in detail here), it's going to be hard to actually assist. – hrbrmstr Jun 13 '17 at 01:24
  • I think [Gitlab](http://www.gitlab.com) can be installed and run on a local server, so your IT department might be able to set something like that up for a GitHub-style experience. – Marius Jun 13 '17 at 03:47
  • `git` has never required internet access. It cooperates with other computers, but those can be on a closed network, and in many cases they really are. – max630 Jun 13 '17 at 11:41

0 Answers