
I have a huge amount of data in a directory tree like:
c:/user/name/class/std/section I to VI [each section has its own folder, i.e. 6 folders in total, and every folder has 100+ files to be processed]
I wrote a script in which, if I give it the folder containing the files (e.g. Section I), it processes the files inside using the glob.iglob function.
Is it possible to write a script that can walk the directories on its own: entering one directory > processing the files > leaving the directory > entering a different directory > and so on?
Please help.
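
Roughly, what my current script does for a single folder is something like this (process_file here is just a placeholder for the actual per-file work, and the folder path is an example):

import glob
import os

folder = 'c:/user/name/class/std/Section I'   # one section at a time, by hand

for path in glob.iglob(os.path.join(folder, '*')):
    if os.path.isfile(path):
        process_file(path)                    # placeholder for the real per-file work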

diffracteD

3 Answers


You can try this code:

import os

folder = 'C:'

# os.walk visits every directory under folder and yields (root, dirs, files)
for root, dirs, files in os.walk(folder):
    for name in files:
        print(os.path.join(root, name))   # full path of every file
    for name in dirs:
        print(os.path.join(root, name))   # full path of every sub-directory

UPDATE:

import os

folder = 'C:'

for root, dirs, files in os.walk(folder):
    for name in files:
        nm, ext = os.path.splitext(name)
        if ext == ".csv":                  # splitext keeps the leading dot
            print(os.path.join(root, name))
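
If some files use an upper-case extension (e.g. .CSV), you could compare ext.lower() == ".csv" instead.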
urcm
  • But suppose I want to find and process *.csv files inside each folder; what would be the easiest way? – diffracteD May 16 '12 at 11:58
import os

directory = 'c:/user/name/class/std'   # root of the tree from the question

for dirpath, dirnames, filenames in os.walk(directory):
    # Do some processing with the files in this directory
    pass

That will iterate from the root of the directory you specify, e.g. c:/user/name/class/std, enter every folder contained in it, and give you the folders and files contained in each. With that you should be able to do the processing you need.
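
For example, to pick up just the *.csv files in each directory as it is visited, one way is to use the standard fnmatch module (process_file below is a placeholder for whatever you do with each file):

import fnmatch
import os

directory = 'c:/user/name/class/std'   # root of the tree from the question

for dirpath, dirnames, filenames in os.walk(directory):
    for csv_name in fnmatch.filter(filenames, '*.csv'):    # only the .csv files here
        process_file(os.path.join(dirpath, csv_name))       # placeholder per-file processing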

Christian Witts

Addressing Abhisek's comment on Aragon's solution:

import os

folder = 'C:'

for root, dirs, files in os.walk(folder):
    for name in files:
        (base, ext) = os.path.splitext(name)
        if ext == ".csv":                  # compare against ".csv"; splitext keeps the dot
            print(os.path.join(root, name))
Simon Peverett