
I am making a system that reads huge CSV files and stores their data in a database. By huge I mean 90 CSV files with 30,000 rows each, and every row has around 30-40 columns.

I was wondering what the fastest way is to loop through all those files.

Currently I am using the default built-in CSV reader, but I don't know if that is the fastest/most efficient way.
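To illustrate, here is a simplified sketch of the kind of loop I mean, assuming the language is Python with its built-in csv module and SQLite as a stand-in database; the folder path, table name, and schema are placeholders, not my actual setup:

```python
# Minimal sketch, assuming Python's built-in csv module and sqlite3.
# "csv_files/", "data.db", and the "records" table are hypothetical placeholders.
import csv
import glob
import sqlite3

conn = sqlite3.connect("data.db")  # hypothetical target database
cur = conn.cursor()
# The "records" table is assumed to already exist with matching columns.

for path in glob.glob("csv_files/*.csv"):  # hypothetical folder of ~90 files
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # skip the header row if present
        placeholders = ",".join("?" * len(header))
        # executemany batches all rows of one file into a single call,
        # which keeps the per-row overhead low.
        cur.executemany(
            f"INSERT INTO records VALUES ({placeholders})",
            reader,
        )
    conn.commit()  # commit once per file instead of once per row

conn.close()
```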

Does anyone know a better way?

