I created more than 500,000 JSON documents with a script that connects to an API. I want to import these documents into RethinkDB, but it seems that RethinkDB cannot import that many separate files at once, so I thought about merging them all into one big JSON file (say bigfile.json). Here is their structure:
file 1.json:
{
"key_1": "value_1.1",
"key_2": "value_1.2",
"key_3": "value_1.3",
...
"key_n": "value_1.n"
}
file 2.json:
{
"key_1": "value_2.1",
"key_2": "value_2.2",
"key_3": "value_2.3",
...
"key_n": "value_2.n"
}
...
file n.json:
{
"key_1": "value_n.1",
"key_2": "value_n.2",
"key_3": "value_n.3",
...
"key_n": "value_n.n"
}
I was wondering what the best structure for the big JSON file would be (to be complete, each file has a specific name composed of 3 variables, the first being a timestamp in YYYYMMDDHHMMSS format), and which command or script would let me do the merging (so far I have only written bash scripts). A rough sketch of what I had in mind is below.
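For what it's worth, this is roughly what I was thinking of trying, assuming all the files sit in one directory (./documents here is just a placeholder), each file contains exactly one JSON object as shown above, and a single top-level JSON array is an acceptable input format for the import:

#!/usr/bin/env bash
# Rough sketch: concatenate every per-document JSON file into one
# top-level array in bigfile.json, using jq's --slurp (-s) mode.
set -euo pipefail

# Piping through find/xargs avoids the "argument list too long" error
# that 500,000+ file names would trigger with a plain shell glob.
find ./documents -name '*.json' -print0 \
  | xargs -0 cat \
  | jq -s '.' > bigfile.json

I am not sure this is the right approach, though, and if the timestamp from each file name should end up inside the corresponding document (e.g. as an extra field), I do not know the cleanest way to inject it during the merge.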