
I would like to export the table schema, table name, column names, and data types to a JSON file. I managed to do so; however, the data type does not include the length of varchar columns.

How do I export the varchar length instead of just getting character varying from the information schema?

Export.py:

import json

import psycopg2

from config import config  # assumed local helper that returns connection parameters


def GetAllTables():
    conn = None
    try:
        params = config()
        conn = psycopg2.connect(**params)
        cur = conn.cursor()
        # Pull one row per column from the information schema.
        cur.execute("""SELECT table_schema, table_name, column_name, data_type
                       FROM information_schema.columns;""")
        rows = cur.fetchall()

        # Convert each row tuple into a dict so json.dump emits named fields.
        rowarray_list = []
        for row in rows:
            rowarray_list.append({'table_schema': row[0],
                                  'table_name': row[1],
                                  'column_name': row[2],
                                  'data_type': row[3]})

        rowarray_file = 'static/api/tables.json'
        with open(rowarray_file, 'w') as outfile:
            json.dump(rowarray_list, outfile, indent=4)

        cur.close()
    except (Exception, psycopg2.DatabaseError) as error:
        print(error)
    finally:
        if conn is not None:
            conn.close()
     

This returns entries such as:

{
    "table_schema": "x",
    "table_name": "x1",
    "column_name": "x11",
    "data_type": "character varying"
}

However, the data type in MySQL is varchar(255). This is what I would like to extract into the JSON file.

Denise
  • I'm confused: psycopg2 is an adapter for Postgresql, but you are asking about MySQL? – snakecharmerb Sep 07 '20 at 05:28
  • *However, the data type in MySQL is varchar(255).* VARCHAR means **VAR**ying **CHAR**acter. What's wrong? If you need to receive `varchar(255)` strictly, then you must use a proper CASE and, for character data types, investigate the `CHARACTER_MAXIMUM_LENGTH` column and build the proper value using string functions. – Akina Sep 07 '20 at 05:45
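
Building on Akina's comment, here is a minimal sketch of the adjusted query, assuming PostgreSQL (which psycopg2 targets): in information_schema.columns, the character_maximum_length column holds the declared length, and a CASE expression can rebuild the varchar(n) label. The QUERY name is illustrative; the string would replace the SQL passed to cur.execute in GetAllTables above.

    # Sketch only (assumes PostgreSQL): character_maximum_length is NULL
    # for types without a declared length, so those fall through to ELSE.
    QUERY = """
        SELECT table_schema,
               table_name,
               column_name,
               CASE
                   WHEN data_type = 'character varying'
                        AND character_maximum_length IS NOT NULL
                       THEN 'varchar(' || character_maximum_length || ')'
                   ELSE data_type
               END AS data_type
        FROM information_schema.columns;
    """

With this query, row[3] would come back as, e.g., varchar(255) instead of character varying, and the rest of the function can stay unchanged.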

0 Answers