I've been looking into this for a while, but I'm finding it hard to find examples for my specific case: I want to grab all the CSVs in a folder on an FTP server, combine them, and then display the result. I've been able to grab single files fine, but when I try to handle multiple files and combine them, I get an error stating:
TypeError Traceback (most recent call last)
<ipython-input-14-7b3417be9f4e> in <module>
19 print (mycsvdir)
20
---> 21 csvfiles = glob.glob(os.path.join(mycsvdir , '*.csv'))
22 dataframes = []
23 for csvfile in csvfiles:
c:\users\xxx\appdata\local\programs\python\python37-32\lib\ntpath.py in join(path, *paths)
74 # Join two (or more) paths.
75 def join(path, *paths):
---> 76 path = os.fspath(path)
77 if isinstance(path, bytes):
78 sep = b'\\'
TypeError: expected str, bytes or os.PathLike object, not list
I'm combining them all into a single file, and it shouldn't just be a list, so I'm guessing I've done something fundamentally wrong. Full code:
import glob
import os
import pandas as pd
import ftplib
from ftplib import FTP

def grabFile(ftp_obj, filename):
    localfile = open(filename, 'wb')
    ftp.retrbinary('RETR ' + filename, localfile.write, 1024)

ftp = FTP('f20-preview.xxx.com')
ftp.login(user='xxx', passwd='xxx')
ftp.cwd('/testfolder/')

mycsvdir = []
ftp.dir(mycsvdir.append)

files = []
for line in mycsvdir:
    print(mycsvdir)

csvfiles = glob.glob(os.path.join(mycsvdir, '*.csv'))
dataframes = []
for csvfile in csvfiles:
    df = pd.read_csv(csvfile)
    dataframes.append(df)

result = pd.concat(dataframes, ignore_index=True)
result.to_csv('all.csv', index=False)

data = pd.read_csv('all.csv')
data.head()
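For reference, this is roughly the flow I'm trying to end up with, pieced together from the older examples I've read. I'm not certain nlst() is the right way to list the remote files (as opposed to dir()), so treat this as a sketch of my intent rather than working code; the server details are placeholders:

import ftplib
import pandas as pd

ftp = ftplib.FTP('f20-preview.xxx.com')
ftp.login(user='xxx', passwd='xxx')
ftp.cwd('/testfolder/')

# nlst() should return plain file names, unlike dir(), which returns formatted listing lines
remote_csvs = [name for name in ftp.nlst() if name.lower().endswith('.csv')]

dataframes = []
for filename in remote_csvs:
    # download each CSV into the local working directory, then read it with pandas
    with open(filename, 'wb') as localfile:
        ftp.retrbinary('RETR ' + filename, localfile.write)
    dataframes.append(pd.read_csv(filename))

result = pd.concat(dataframes, ignore_index=True)
result.to_csv('all.csv', index=False)
data = pd.read_csv('all.csv')
data.head()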
I'm relatively new to Python, and a lot of my experience comes from reading very old posts and lessons on the matter, so I apologize for my naivete.