We are appending data to a BigQuery table from all the CSV files available on Google Drive. Below is the code, which works fine for a single file (trainers.csv).
I need help processing all the files in a single go. How can I read every available CSV file from Google Drive, load it into a pandas DataFrame, and run my complete process in a loop?
from google.colab import drive
import pandas as pd

# Mount Google Drive so the files are visible under /content/drive
drive.mount('/content/drive')

my_data = pd.read_csv('/content/drive/MyDrive/Vestiaire_data/july-2022/trainers.csv',
                      encoding='ISO-8859-1', low_memory=False)
my_data.to_gbq(-------------)  # BigQuery destination arguments omitted in the question
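One possible approach is a minimal sketch along these lines: glob every CSV in the Drive folder and repeat the same read/upload step for each file. It assumes Drive is already mounted as above, and that the folder path from the question holds all the files; the destination table 'my_dataset.my_table' and the project_id are hypothetical placeholders you would replace with your own values.

import glob
import pandas as pd

folder = '/content/drive/MyDrive/Vestiaire_data/july-2022'

# Loop over every CSV file in the folder and append it to the same table
for csv_path in glob.glob(f'{folder}/*.csv'):
    df = pd.read_csv(csv_path, encoding='ISO-8859-1', low_memory=False)
    df.to_gbq('my_dataset.my_table',        # hypothetical destination table
              project_id='my-gcp-project',  # hypothetical GCP project id
              if_exists='append')           # append each file's rows

The key detail is if_exists='append', which tells to_gbq to add rows to the existing table instead of failing or replacing it; without it, the second file in the loop would raise an error because the table already exists.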