I have lists of files in folders, saved as .txt files, and I want to delete the last entry of each list. The lists differ in length, so I can't say, for example, "always delete the 7th member." The files have random names, so I just want to import a list of files (or a whole folder) into Python and delete the last member of each.
I've found this bit of code ("Delete Lines From a File in Python") that works well for an individual file. It reads a .txt, deletes one line, and resaves it as a .txt. For my purposes this will work, because I can then read all of my updated .txt files into a merged dataframe.
with open('file1.txt', 'r+') as fp:
    lines = fp.readlines()
    fp.seek(0)
    fp.truncate()
    fp.writelines(lines[:-1])
However, I'm looking for a way to import all of my files without typing in their individual names. I've used glob for this before, e.g.:

import glob
import pandas as pd

interesting_files = glob.glob('/Users/location/*.txt')
dataframe1 = pd.concat(pd.read_csv(f, sep=' ', names=['folder', 'num_files'])
                       for f in interesting_files)
So, how can I apply this to many files instead of just file1.txt, either by pointing at one source location that holds all my files (as glob does) or with a for loop? Are there other approaches that would work, like reading each .txt file into a dataframe and dropping its final row before reading in the next one?