
I have a folder called detail_data, in which there are multiple folders whose names are numbers, for example 1, 4, 8, etc. Each of these folders contains one Excel file called Measurement table. I used the following code, but it returns an empty list.

import os
import pandas as pd

path = pathname
df_list = []
for i in os.listdir(path):
    if i.isdigit():  # keep only the subfolders whose names are numbers
        pathfile = path + '/' + i + "/Measurement table.csv"
        df = pd.read_csv(pathfile)
        df_list.append(df)

Why is it returning an empty list? Is there any other way to loop over all these folders? There are about 40 folders like this.
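A quick sanity check of what the loop actually sees (a sketch, assuming `path` points at the detail_data folder); entries with stray spaces or non-digit characters fail the `isdigit()` test and would leave the list empty:

import os

path = "detail_data"  # assumption: relative path to the folder described above
for entry in os.listdir(path):
    print(repr(entry), entry.isdigit())  # repr() exposes stray spaces in names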

  • Look at os.walk() – manu190466 Apr 22 '20 at 07:51
  • I think there are multiple answers on SO about getting files recursively from folders. Here is one you can look at: [link](https://stackoverflow.com/questions/20906474/import-multiple-csv-files-into-pandas-and-concatenate-into-one-dataframe) – sammywemmy Apr 22 '20 at 07:52
  • Does this answer your question? [Import multiple csv files into pandas and concatenate into one DataFrame](https://stackoverflow.com/questions/20906474/import-multiple-csv-files-into-pandas-and-concatenate-into-one-dataframe) – Mayank Porwal Apr 22 '20 at 07:57
  • If your file is called `Measurement Data`, you should not use `Measurement table.csv` to read it; it simply won't work. – lenik Apr 22 '20 at 08:07
  • Yes, I have tried that. It gives a Permission denied error. – Sca Apr 22 '20 at 08:43

1 Answer


You may use os.walk(). It provides a lot of conveniences for code that needs to go through all files in all subdirectories, and more.

import os
import pandas as pd

dict_of_files = {}
for dirpath, dirnames, filenames in os.walk(path):  # path is detail_data from the question
    for filename in filenames:
        if filename == "Measurement table.csv":
            full_path = os.path.join(dirpath, filename)  # build the full path to the file
            dict_of_files[full_path] = pd.read_csv(full_path)
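Since the question also asks for another way to loop over the folders: a glob pattern can match the file in every numbered subfolder directly. A minimal sketch, assuming the files really are CSVs named exactly `Measurement table.csv` (if they are actual Excel workbooks, `pd.read_excel` would be needed instead, as the comments hint):

import glob
import pandas as pd

# "detail_data" is the folder name from the question; adjust the path as needed.
df_list = [pd.read_csv(p) for p in glob.glob("detail_data/*/Measurement table.csv")]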
  • Please don't post only code as an answer, but also provide an explanation of what your code does and how it solves the problem in the question. Answers with an explanation are usually of higher quality and are more likely to attract upvotes. – Mark Rotteveel Apr 22 '20 at 10:57