
I have a 700MB file with rows in the JSON format below, and I want it in a dataframe with three columns: hash, addresses, amount.

Sample rows:

{"hash":"f2ccfd4fd000416109ce2adaf621e0a982ee3497fb733a45e2c4950ae4521a83","addresses":["3D8SFXR739SYr9EuvZrk2NtFbaw4HSQUXB"],"amount":"0.01232"}
{"hash":"2c7cb62456b326065a5cdeca54e7e4d681e748ffcd34442cfdce318f2f80e0b0","addresses":["1Ueu8QrxUkRpmMWKmWofpeMtAi6cHFyNj"],"amount":"0.0360826"}

I tried in R

final <- fromJSON(file="input")

which gives me a list with just a single row. I also tried

cat input | sed -e "s/:/,/g" | awk -F"," '{print $2 "," $4 "," $6}' > inputsnew.csv

How can I resolve this?
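The file is newline-delimited JSON (NDJSON): each line is a complete JSON object, so a single-document parse like `fromJSON(file="input")` stops after the first value, and a line-aware reader is needed. As a cross-language sketch, pandas in Python parses this format directly with `read_json(..., lines=True)`; a minimal example using the two sample rows above (for the real 700MB file you would pass the path `input` in place of the `StringIO`):

```python
import io
import pandas as pd

# The two sample rows from the question, one JSON object per line (NDJSON).
ndjson = (
    '{"hash":"f2ccfd4fd000416109ce2adaf621e0a982ee3497fb733a45e2c4950ae4521a83",'
    '"addresses":["3D8SFXR739SYr9EuvZrk2NtFbaw4HSQUXB"],"amount":"0.01232"}\n'
    '{"hash":"2c7cb62456b326065a5cdeca54e7e4d681e748ffcd34442cfdce318f2f80e0b0",'
    '"addresses":["1Ueu8QrxUkRpmMWKmWofpeMtAi6cHFyNj"],"amount":"0.0360826"}\n'
)

# lines=True parses one JSON object per line into one row each;
# dtype=False keeps "amount" as the string it is in the source file.
df = pd.read_json(io.StringIO(ndjson), lines=True, dtype=False)
print(df.shape)  # (2, 3)
```

The `addresses` column holds Python lists, one per row. In R, `jsonlite::stream_in(file("input"))` is the analogous streaming reader for newline-delimited JSON and should produce a data frame with the same three columns.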

    [This](https://stackoverflow.com/questions/16947643/getting-imported-json-data-into-a-data-frame/37739735#37739735) post might help. – DJJ May 25 '20 at 11:12
  • No, it did not work; I tried it. My JSON file is in a different format. – pranav nerurkar May 25 '20 at 12:11
  • How does that `JSON` file get generated? It seems not to be compliant with standard format (you have multiple root elements not separated by a comma). You can validate any `JSON` blob using something like [this](https://jsonformatter.curiousconcept.com/) – anddt May 25 '20 at 12:49
  • I exported a BigQuery table to Google Cloud Storage. – pranav nerurkar May 25 '20 at 15:35
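As the last two comments establish, a BigQuery export to Cloud Storage in JSON format is newline-delimited: the file as a whole is not one valid JSON document, but every individual line is. The line-by-line conversion the `sed`/`awk` pipeline was approximating can be sketched with Python's standard library (joining multi-address rows with `;` is my own choice here, not something the question specifies):

```python
import csv
import io
import json

# The two sample rows from the question, one JSON object per line.
ndjson = (
    '{"hash":"f2ccfd4fd000416109ce2adaf621e0a982ee3497fb733a45e2c4950ae4521a83",'
    '"addresses":["3D8SFXR739SYr9EuvZrk2NtFbaw4HSQUXB"],"amount":"0.01232"}\n'
    '{"hash":"2c7cb62456b326065a5cdeca54e7e4d681e748ffcd34442cfdce318f2f80e0b0",'
    '"addresses":["1Ueu8QrxUkRpmMWKmWofpeMtAi6cHFyNj"],"amount":"0.0360826"}\n'
)

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["hash", "addresses", "amount"])
for line in ndjson.splitlines():
    record = json.loads(line)  # each line parses as valid JSON on its own
    writer.writerow([record["hash"], ";".join(record["addresses"]), record["amount"]])

print(out.getvalue())
```

Unlike the text substitution with `sed`, this uses a real JSON parser per line, so quoting, brackets, and multi-element address arrays are handled correctly.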

0 Answers