Suppose I have a base directory containing 1000 .txt files with names such as CTxyx.ggg.txt. I have pasted these file paths ("copy as path") into an Excel sheet. I need only 100 of these files, each with a different value of "ggg" depending on the user's requirements. How do I match these files in the directory and extract only 100 of the 1000 files? I tried using fnmatch but it didn't work. Can someone please suggest Python code for this?
- You can use globs: https://docs.python.org/3.8/library/glob.html – gergelykalman Sep 18 '20 at 17:54
- Have you done any research? _Can someone please suggest a code for this in python._ I believe that's off-topic. Please see [ask], [help/on-topic]. – AMC Sep 18 '20 at 23:02
- Does this answer your question? [Get a filtered list of files in a directory](https://stackoverflow.com/questions/2225564/get-a-filtered-list-of-files-in-a-directory) – AMC Sep 18 '20 at 23:04
2 Answers
You can do something like this:
import glob

value = 'ggg'
# Keep only the .txt files whose name contains the value
txt_files = [txt_file for txt_file in glob.glob('*.txt') if value in txt_file]
`glob` will search the current directory for a `.txt` match, and you can then take only the files containing `value`.
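Since the question mentions roughly 100 different "ggg" values (taken from an Excel sheet), the same idea extends to a whole set of values. A minimal sketch, assuming the values have already been read from Excel into a Python set (`select_files` and the sample values are hypothetical names for illustration):

```python
import glob

def select_files(filenames, wanted):
    """Keep only the filenames containing one of the wanted 'ggg' values."""
    return [f for f in filenames if any(value in f for value in wanted)]

# Hypothetical values; in practice read them from the Excel sheet
# (e.g. with openpyxl or pandas).
wanted = {'101', '205'}
files = glob.glob('*.txt')          # all .txt files in the current directory
selected = select_files(files, wanted)
```

For 1000 files and 100 values this linear scan is more than fast enough, and it avoids building 100 separate glob patterns.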

PApostol
import fnmatch
import os

contents = []
for file in os.listdir('/Users/yourname/documents'):
    if fnmatch.fnmatch(file, '*ggg*'):
        contents.append(file)
This should work and will give you a list of all the filenames that contain ggg. What you want to do with that list is up to you. If you want to be more specific with your search, you can just change ggg to suit your needs.

Brandon