
I have a 6+ GB .json file that I wish to import into R. My end goal is to convert the .json file into a table in R that I can then use with machine learning packages.

I am having trouble understanding what exactly I need to do to make this happen: I do not quite understand the .json-to-R conversion process, and the large file size raises issues of its own.

  • Welcome to StackOverflow! Please read the info about [how to ask a good question](http://stackoverflow.com/help/how-to-ask) and how to produce a [minimal reproducible example](http://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example/5963610#5963610). This will make it much easier for others to help you. – Jaap Mar 25 '15 at 15:46
  • It's pretty simple, provided memory doesn't get to be an issue: `jsonlite::fromJSON("file.json")`. If you can't load that due to memory constraints, you'll have to explore other options. – Thomas Mar 25 '15 at 15:55
  • @Thomas Thank you for the assistance. I ran it and it came back with "Error in rawToChar(x) : long vectors not supported yet". How do you suggest I edit down the .json file? Which tool is best for reducing the size by editing out entries? – numerator Mar 25 '15 at 21:56
  • The easiest solution would be to avoid such a huge file and instead change the upstream process to output a set of smaller files. – Scott Stensland Mar 26 '15 at 17:29
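
If the file fits in memory, Thomas's one-liner above is the usual starting point. A minimal sketch (the file name is a placeholder):

```r
library(jsonlite)

# One-shot read: parses the entire file and, where the structure allows,
# simplifies nested JSON into a data frame suitable for modelling.
dat <- fromJSON("file.json")  # placeholder path
str(dat)                      # inspect what structure came back
```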
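
When the one-shot read fails with an error like `long vectors not supported yet`, streaming the file in chunks is one alternative. This is a sketch, not a definitive fix: it assumes the data is in (or can be converted to) NDJSON format, one JSON record per line, and the column names in the handler are hypothetical.

```r
library(jsonlite)

chunks <- list()

# stream_in() reads NDJSON in pages of `pagesize` records, handing each
# page to the handler as a data frame instead of loading the whole file.
stream_in(file("file.ndjson"),  # placeholder path; must be NDJSON
          handler = function(df) {
            # Hypothetical filtering step: keeping only the fields needed
            # for modelling reduces what must be held in memory.
            chunks[[length(chunks) + 1]] <<- df[, c("field1", "field2")]
          },
          pagesize = 10000)

dat <- do.call(rbind, chunks)  # combine the reduced chunks into one table
```

If the file is instead one giant JSON array, it would first need converting to NDJSON or splitting into smaller files, along the lines Scott Stensland suggests above.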

0 Answers