I have a data pipeline that parses and cleans incoming data and produces a data file with a few thousand rows. I need to load this data into MySQL, across several different tables. New data arrives every hour, and the pipeline generates a new data file each time. Currently I am inserting/updating the MySQL tables row by row while iterating over the data file.
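For context, here is roughly what my row-by-row step looks like. This is a simplified sketch: the table name `measurements`, the column names, the CSV input, and the use of Python with mysql-connector-python are just stand-ins for my real setup.

```python
import csv
import mysql.connector

# Connect to MySQL (credentials/database names are placeholders).
conn = mysql.connector.connect(
    host="localhost", user="etl", password="secret", database="mydb"
)
cursor = conn.cursor()

# Walk the hourly data file and issue one statement per row.
with open("hourly_data.csv", newline="") as f:
    for row in csv.reader(f):
        # One round trip to the server per row; ON DUPLICATE KEY UPDATE
        # lets re-runs of the same file update existing rows instead of failing.
        cursor.execute(
            "INSERT INTO measurements (id, value) VALUES (%s, %s) "
            "ON DUPLICATE KEY UPDATE value = VALUES(value)",
            (row[0], row[1]),
        )

conn.commit()
cursor.close()
conn.close()
```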
I wanted to ask: is there a more efficient way to insert this data into MySQL?