0

Please help me find a solution to a problem with importing data from multiple CSV files into one DataFrame in Python. The code is:

import pandas as pd
import os
import glob

path = r'my_full_path'
os.chdir(path)
results = pd.DataFrame()

for counter, current_file in enumerate(glob.glob("*.csv")):
    namedf = pd.read_csv(current_file, header=None, sep=",", delim_whitespace=True)
    results = pd.concat([results, namedf], join='outer')

results.to_csv('Result.csv', index=None, header=None, sep=",")

The problem is that some of the data ends up in new rows instead of new columns, as required. What is wrong with my code?
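For reference, the default behavior of `pd.concat` is to stack frames row-wise (`axis=0`), while `axis=1` places them side by side as columns. A minimal sketch with two small frames standing in for two CSV files (the data here is hypothetical):

```python
import pandas as pd

# Two tiny frames standing in for two CSV files.
a = pd.DataFrame({"x": [1, 2]})
b = pd.DataFrame({"y": [3, 4]})

# Default axis=0 stacks rows; columns are unioned and gaps become NaN.
rows = pd.concat([a, b], join="outer")

# axis=1 adds the second frame as new columns instead.
cols = pd.concat([a, b], axis=1)

print(rows.shape)  # (4, 2)
print(cols.shape)  # (2, 2)
```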

P.S.: I found questions about importing multiple CSV files into a DataFrame, for example here: "Import multiple csv files into pandas and concatenate into one DataFrame", but the solution doesn't solve my issue :-(

Cindy

2 Answers

0

It was solved by reading each file with `read_csv()`, appending the resulting DataFrames to a list, and concatenating them with `axis=1` so each file becomes new columns:

import os
import pandas as pd

def get_merged_files(files_list, **kwargs):
    dataframes = []
    for file in files_list:
        df = pd.read_csv(file, **kwargs)
        dataframes.append(df)
    # axis=1 concatenates column-wise, one block of columns per file
    return pd.concat(dataframes, axis=1)
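A self-contained usage sketch, repeating the helper so the snippet runs on its own (the `part*.csv` file names are hypothetical):

```python
import glob
import os
import pandas as pd

def get_merged_files(files_list, **kwargs):
    # Read each CSV and concatenate column-wise.
    dataframes = [pd.read_csv(f, **kwargs) for f in files_list]
    return pd.concat(dataframes, axis=1)

# Create two throwaway CSV files to merge.
pd.DataFrame({"a": [1, 2]}).to_csv("part1.csv", index=False)
pd.DataFrame({"b": [3, 4]}).to_csv("part2.csv", index=False)

merged = get_merged_files(sorted(glob.glob("part*.csv")))
print(merged.shape)  # (2, 2)

# Clean up the throwaway files.
for f in ("part1.csv", "part2.csv"):
    os.remove(f)
```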
Cindy
0

You can try using this:

import pandas as pd
import os

files = os.listdir('./Your_Folder')  # the folder where all the files are located

all_csv_files = pd.DataFrame()

for file in files:
    df = pd.read_csv("./Your_Folder/" + file)
    all_csv_files = pd.concat([all_csv_files, df])

# Write once, after the loop, instead of rewriting the file on every iteration.
all_csv_files.to_csv("All_CSV_Files_Concat.csv", index=False)
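One caveat: `os.listdir` returns every file in the folder, not just CSVs. A self-contained sketch that filters by extension first (the folder and file names here are throwaway examples):

```python
import os
import shutil
import pandas as pd

# Build a throwaway folder with two CSVs and one non-CSV file.
folder = "demo_folder"
os.makedirs(folder, exist_ok=True)
pd.DataFrame({"a": [1]}).to_csv(os.path.join(folder, "one.csv"), index=False)
pd.DataFrame({"a": [2]}).to_csv(os.path.join(folder, "two.csv"), index=False)
open(os.path.join(folder, "notes.txt"), "w").close()  # should be skipped

# Keep only files ending in .csv before reading.
csv_names = [f for f in os.listdir(folder) if f.endswith(".csv")]
frames = [pd.read_csv(os.path.join(folder, f)) for f in csv_names]
combined = pd.concat(frames, ignore_index=True)
print(len(combined))  # 2

shutil.rmtree(folder)  # clean up the throwaway folder
```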