I am trying to move data from one database (db1) to another (db2).
After a few attempts I came up with the code below, but it is far too slow: by my estimate, transferring all the data will take more than 7,000 hours.
Is there a way to make it faster? Can I process more than one row from the cursor at a time?
import csv
import io

# db1 and db2 are open psycopg2 connections
batch_size = 10
transfer_table_name = 'my_table'

with db1.cursor(name='my_cursor') as cursor:
    cursor.itersize = batch_size
    cursor.execute('select * from schema.table')
    for row in cursor:
        # one COPY round trip per row
        csv_io = io.StringIO()
        writer = csv.writer(csv_io)
        writer.writerow(row)
        csv_io.seek(0)
        db2.cursor().copy_expert(
            'COPY ' +
            transfer_table_name +
            """ FROM STDIN WITH (NULL '', DELIMITER ',', FORMAT CSV)""",
            csv_io
        )
        db2.commit()
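
For context, this is roughly what I mean by processing more than one row at once: buffer a whole batch of rows into a single CSV payload and issue one COPY per batch instead of one per row. It is only a sketch of the idea, assuming db1 and db2 are psycopg2 connections as above; I have not measured whether it actually helps.

import csv
import io

batch_size = 10_000
transfer_table_name = 'my_table'

with db1.cursor(name='my_cursor') as cursor:
    cursor.itersize = batch_size
    cursor.execute('select * from schema.table')
    while True:
        rows = cursor.fetchmany(batch_size)
        if not rows:
            break
        # write the whole batch into one in-memory CSV buffer
        csv_io = io.StringIO()
        csv.writer(csv_io).writerows(rows)
        csv_io.seek(0)
        # single COPY round trip for the entire batch
        with db2.cursor() as out_cursor:
            out_cursor.copy_expert(
                'COPY ' + transfer_table_name +
                " FROM STDIN WITH (NULL '', DELIMITER ',', FORMAT CSV)",
                csv_io
            )
        db2.commit()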