
Django has the convenient `manage.py` command `dumpdata`, which can be configured to dump an entire database as JSON.

At present I am confined to using SQLAlchemy, and I'd like to do the same thing:

Take as input a connection string like `'mysql+pymysql://user:pwd@localhost:3306/'` and get the contents of the database as JSON (I don't require all the meta info that Django provides, but I wouldn't mind having it).

I found this question explaining how to dump SQLAlchemy objects to JSON, and this snippet from the SQLAlchemy documentation outlining how to get all table names from a database:

from sqlalchemy import create_engine, MetaData

meta = MetaData()
input_db = 'sqlite:///tmpsqlite'
engine = create_engine(input_db)
meta.reflect(bind=engine)  # load all table definitions from the database
print(meta.tables)  # mapping of table name -> Table object

How do I retrieve all of the content of these tables and convert it to JSON? Is there a built-in command in SQLAlchemy similar to Django's dumpdata functionality?

Sebastian Wozny

1 Answer


Leaving my solution here for posterity:

import json

from sqlalchemy import create_engine, MetaData

def dump_sqlalchemy(output_connection_string, output_schema):
    """Return the entire content of a database as a JSON string of lists of dicts, keyed by table name."""
    # The two arguments are simply concatenated into one SQLAlchemy URL.
    engine = create_engine(f'{output_connection_string}{output_schema}')
    meta = MetaData()
    meta.reflect(bind=engine)  # http://docs.sqlalchemy.org/en/rel_0_9/core/reflection.html
    result = {}
    for table in meta.sorted_tables:
        # Each row proxy is a mapping, so dict(row) yields {column: value}.
        # Note: engine.execute() is the pre-2.0 API; on SQLAlchemy 2.0 use
        # engine.connect() and row._mapping instead.
        result[table.name] = [dict(row) for row in engine.execute(table.select())]
    return json.dumps(result)
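
For example, with the sqlite database from the question (the table and column names in the output shown here are only illustrative):

dump = dump_sqlalchemy('sqlite:///', 'tmpsqlite')
print(dump)  # e.g. '{"users": [{"id": 1, "name": "alice"}]}'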
Sebastian Wozny
  • Note that the `zip(...)` is redundant. A [`RowProxy`](http://docs.sqlalchemy.org/en/latest/core/connections.html#sqlalchemy.engine.RowProxy) is a mapping in its own right and you can simply pass it to the `dict()` constructor: `dict(row)`. – Ilja Everilä Nov 15 '17 at 13:33
  • Which is what you get if you pass `dict()` a row proxy object. – Ilja Everilä Nov 15 '17 at 14:10
  • The imports required for sqlalchemy are `from sqlalchemy import create_engine; from sqlalchemy.schema import MetaData`. – pigishpig May 12 '21 at 19:05
  • Is this generating one JSON document per table? And couldn't this break with a low memory limit or a huge amount of data per table? – Gabor Apr 03 '23 at 10:27
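
That concern is fair: the answer accumulates every row of every table in memory before serializing. Below is a minimal sketch of a chunked, per-table variant (not from the original answer; it uses the same pre-2.0 SQLAlchemy API as above, and `chunk_size` and the one-file-per-table output layout are illustrative assumptions):

import json

from sqlalchemy import create_engine, MetaData

def dump_sqlalchemy_per_table(connection_string, out_dir='.', chunk_size=1000):
    """Write each table to its own JSON file, fetching rows in chunks to bound memory use."""
    engine = create_engine(connection_string)
    meta = MetaData()
    meta.reflect(bind=engine)
    for table in meta.sorted_tables:
        # One file per table, written incrementally as a JSON array.
        with open(f'{out_dir}/{table.name}.json', 'w') as f:
            f.write('[')
            result = engine.execute(table.select())
            first = True
            while True:
                rows = result.fetchmany(chunk_size)  # fetch a bounded batch of rows
                if not rows:
                    break
                for row in rows:
                    if not first:
                        f.write(', ')
                    f.write(json.dumps(dict(row)))
                    first = False
            f.write(']')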