
I have a huge DB schema for vehicle board cards. The data is currently stored across multiple Excel files. My job was to create a database schema to dump all this data into MySQL, and now I need to build the process that inserts the data into the DB.

This is an example of how the Excel tables are organized (see the "Excel Data example" image):

The thing is that these Excel files are not well tagged. My question is: what do I need to do to create a script that dumps all this data from Excel into the DB? I'm also using ids, foreign keys, primary keys, joins, etc.


I've thought about this so far:

1. Normalize the structure of the tables in Excel so that the data can be inserted with SQL.

2. Create a Python script to insert the data of each table.

Can you help me figure out where I should start and how? What topics should I google?
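The two steps above can be sketched roughly as follows. All names here are placeholders (an in-memory SQLite engine stands in for MySQL so the sketch is runnable; in practice you would use a connection string like `mysql+pymysql://user:password@localhost/dbname`):

```python
import pandas as pd
from sqlalchemy import create_engine

# Stand-in engine; swap for "mysql+pymysql://user:password@localhost/dbname".
engine = create_engine("sqlite://")

# Stand-ins for two normalized Excel sheets (in real code:
# sheets = pd.read_excel("file.xlsx", sheet_name=None) gives a dict of DataFrames).
owners = pd.DataFrame({"owner_id": [1, 2], "name": ["Ana", "Luis"]})
vehicles = pd.DataFrame(
    {"vehicle_id": [10, 11], "owner_id": [1, 2], "plate": ["ABC-1", "XYZ-2"]}
)

# Insert parent tables before child tables so foreign keys resolve.
owners.to_sql("owners", con=engine, if_exists="append", index=False)
vehicles.to_sql("vehicles", con=engine, if_exists="append", index=False)

# Sanity check: join across the foreign key.
joined = pd.read_sql(
    "SELECT v.plate, o.name FROM vehicles v JOIN owners o ON v.owner_id = o.owner_id",
    engine,
)
print(len(joined))  # 2
```

The key point is the insertion order: rows that other tables reference (parents) go in first, then the rows that carry the foreign keys. Topics to google: "pandas read_excel", "SQLAlchemy engine", "database normalization", "pandas to_sql".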

Chris
  • I'd write a little VBA macro to export to delimited files, then read into MySQL with whatever tool you want. – Error_2646 Feb 26 '20 at 17:35

1 Answer


With pandas you can easily read Excel files (`pd.read_excel` for .xlsx, `pd.read_csv` for .csv) and dump the data into any database supported by SQLAlchemy:

import pandas as pd
from sqlalchemy import create_engine

# Connection string format: dialect+driver://user:password@host/database
engine = create_engine('mysql+pymysql://user:password@localhost/dbname')
df = pd.read_excel('file.xlsx')
df.to_sql('table_name', con=engine, if_exists='append', index=False)

If you run into performance issues dumping to MySQL, you can find faster approaches here: python pandas to_sql with sqlalchemy: how to speed up exporting to MS SQL?
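For reference, the usual first knobs to turn are `to_sql`'s `method` and `chunksize` parameters. A runnable sketch with made-up data (SQLite stands in for MySQL; swap the connection string in practice):

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite://")  # stand-in; use a MySQL URL in practice
df = pd.DataFrame({"id": range(10_000), "value": range(10_000)})

# method="multi" batches many rows into each INSERT statement;
# chunksize bounds how many rows go into one batch.
df.to_sql("measurements", con=engine, if_exists="replace",
          index=False, method="multi", chunksize=400)

count = pd.read_sql("SELECT COUNT(*) AS n FROM measurements", engine)["n"][0]
print(count)  # 10000
```

For MySQL specifically, bulk-loading a CSV with `LOAD DATA INFILE` (as the comment above suggests) is typically faster still than any `to_sql` variant.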

Javier Lopez Tomas