
I have a SQL query that gives me JSON for a joined table of person and pets:

SELECT json_object(
  'personId', p.id,
  'pets', json_arrayagg(json_object(
    'petId', pt.id,
    'petName', pt.name
  ))
)
FROM person p
LEFT JOIN pets pt ON p.id = pt.person_id
GROUP BY p.id;

What would be the best way to export the result to a JSON file that I can then load into Mongo with mongoimport?

I'm using a DB client called TablePlus and its JSON export is weird, corrupting the dates :\

jack miao
  • you could try dumping directly to a file, as in: https://stackoverflow.com/a/935848/460557 – Jorge Campos Dec 28 '18 at 22:25
  • In my experience, the best way is to use a script (PHP, Python, Perl) rather than queries like the one above. You could use PHP, for example, to hold the JSON objects in memory and write them directly to MongoDB, rather than exporting to a file on disk. – cp50 Dec 29 '18 at 00:24
  • As an extension to the above: without knowing the size of the db/table it's impossible to know the best way, since the approach I suggested could, depending on scale, run into OOM or race conditions (depending on parallelism). – cp50 Dec 29 '18 at 00:27
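The script approach from the comments could look roughly like this: read the joined rows, group them into one document per person in memory, and insert them with pymongo. This is only a sketch; the row shape matches the question's query, but the function names, connection settings, and database/collection names below are assumptions, not part of the question.

```python
def rows_to_docs(rows):
    """Group flat (person_id, pet_id, pet_name) rows into person documents."""
    docs = {}
    for person_id, pet_id, pet_name in rows:
        doc = docs.setdefault(person_id, {"personId": person_id, "pets": []})
        if pet_id is not None:  # LEFT JOIN yields NULL pet columns for petless persons
            doc["pets"].append({"petId": pet_id, "petName": pet_name})
    return list(docs.values())

def export_to_mongo():
    """Hypothetical wiring; requires mysql-connector-python and pymongo."""
    import mysql.connector
    from pymongo import MongoClient

    db = mysql.connector.connect(user="root", database="mydb")
    cur = db.cursor()
    cur.execute(
        "SELECT p.id, pt.id, pt.name FROM person p "
        "LEFT JOIN pets pt ON p.id = pt.person_id"
    )
    docs = rows_to_docs(cur.fetchall())
    MongoClient()["mydb"]["people"].insert_many(docs)
```

As cp50 notes, whether holding everything in memory like this is viable depends entirely on table size; for large tables you would insert in batches instead of one `insert_many` call.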

1 Answer


You can export the result of a SQL query to a file on the server that runs your MySQL instance with a query like:

SELECT ... FROM ... WHERE ...
INTO OUTFILE '/tmp/my_query.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

If your MySQL server is running with the --secure-file-priv option enabled, the location where the file can be written is restricted. Use the following command to see the allowed location, and adapt the INTO OUTFILE clause accordingly:

SHOW VARIABLES LIKE "secure_file_priv";

Once you have the data exported, you may still need to reformat it into proper JSON (one document per line, which is what mongoimport expects) using another tool, like PHP, Perl or Python.
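For instance, a small cleanup script could undo the CSV layer: with the INTO OUTFILE clause above, each output line is one JSON object wrapped in double quotes, with the inner quotes (and backslashes) escaped by MySQL's FIELDS ESCAPED BY character. A minimal Python sketch, assuming that export format; the function names and file paths are mine, not from the answer:

```python
import json

def mysql_field_unescape(field):
    """Undo MySQL INTO OUTFILE escaping: a backslash precedes each escaped char."""
    out, i = [], 0
    while i < len(field):
        if field[i] == "\\" and i + 1 < len(field):
            out.append(field[i + 1])  # keep the escaped character literally
            i += 2
        else:
            out.append(field[i])
            i += 1
    return "".join(out)

def csv_line_to_json(line):
    """Strip the enclosing quotes and return a validated JSON document string."""
    field = line.rstrip("\n")
    if field.startswith('"') and field.endswith('"'):
        field = field[1:-1]
    doc = json.loads(mysql_field_unescape(field))  # round-trip to validate
    return json.dumps(doc)

def convert(src="/tmp/my_query.csv", dst="/tmp/my_query.json"):
    # Hypothetical paths; writes one JSON document per line for mongoimport.
    with open(src) as fin, open(dst, "w") as fout:
        for line in fin:
            fout.write(csv_line_to_json(line) + "\n")
```

The resulting file can then be loaded with something like `mongoimport --db mydb --collection people --file /tmp/my_query.json` (database and collection names are placeholders).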

GMB