I have several output files, here are two:
File1:
4
12
13
6
.....
File2:
20
3
9
14
.....
Goal Output:
   r_1  r_2
0    4   20
1   12    3
2   13    9
3    6   14
I need to bulk load them into a huge dataframe. Here's my start:
(1) Create a list of all the files:

import os

allfiles = []
for root, dirs, files in os.walk(r'/my_directory_path/'):
    for file in files:
        if file.endswith('.csv'):
            allfiles.append(file)
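As a side note, if the .csv files sit directly in the directory (no subfolders), the listing step can be sketched with glob instead of os.walk; the directory path here is the placeholder from above:

```python
import glob
import os

# Collect the .csv files in one call; sorted() makes the column
# order reproducible across runs. The directory is a placeholder.
allfiles = sorted(glob.glob(os.path.join('/my_directory_path/', '*.csv')))
```

glob returns full paths, so a later read step would not need to re-join the directory name.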
(2) Loading the files into pandas (the problem is here):

import pandas as pd

big = pd.DataFrame()
for i in allfiles:
    file = '/my_directory_path/' + i
    big[i] = pd.read_csv(file, sep='\t', header=None)
The problem is the big[i] assignment: I need to create a new column inside the for loop for each file, using i as the column name.
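One way this could be done, sketched under the assumption that each file holds a single column of integers (load_columns is a hypothetical helper name, not from the question): read each file into a Series and concatenate the pieces side by side, keyed by file name.

```python
import os
import pandas as pd

def load_columns(directory):
    """Read every single-column .csv in `directory` into one DataFrame,
    with one column per file, named after the file."""
    parts = {}
    for name in sorted(os.listdir(directory)):
        if name.endswith('.csv'):
            path = os.path.join(directory, name)
            # header=None: the files have no header row; .iloc[:, 0]
            # takes the single data column as a Series.
            parts[name] = pd.read_csv(path, sep='\t', header=None).iloc[:, 0]
    # Concatenate column-wise; dict keys become the column names.
    return pd.concat(parts, axis=1)
```

The columns come out named after the files; a final big.columns = [...] rename would produce the r_1, r_2 labels from the goal output if those exact names matter.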