How can I import CSV files from a specific folder into an HDF5 file using Python? For example, I have the path /root/Desktop/mal/ex1/, which contains many CSV files, and I want the HDF5 groups to be named after the files in that path.
-
Check this https://stackoverflow.com/questions/23334211/converting-csv-file-to-hdf5-using-pandas and this https://stackoverflow.com/questions/27203161/convert-large-csv-to-hdf5 – abdullah.cu Aug 22 '20 at 17:58
-
I am trying to understand the answers in both links, but I think my case is different. Anyway, thanks for your support – M-kh Aug 22 '20 at 18:18
-
Your question is not clear. – abdullah.cu Aug 22 '20 at 18:23
-
I have a path, for example /root/ma/ma1, that contains multiple CSV files (about 16). How can I import all of those files into datasets, so that dataset1 contains csvfile1, and so on? Thanks for your support again – M-kh Aug 22 '20 at 18:30
-
If you want to programmatically find and read several files in a folder (or folders), look at the `glob/iglob` functions. I prefer `iglob`; it returns an iterator; `glob` returns a list. You don't have to use `pandas` to read CSV data. I prefer to use Numpy `np.genfromtxt()` along with Pytables or h5py. Check out this example. It shows the process for 1 file. Add an `iglob` loop, and it will work for multiple files. (https://stackoverflow.com/questions/57120995/i-want-to-convert-very-large-csv-data-to-hdf5-in-python/57136066#57136066) – kcw78 Aug 22 '20 at 21:12
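The approach kcw78 describes above (an `iglob` loop over the folder, `np.genfromtxt()` per file, and one h5py dataset named after each file) might be sketched like this. The demo folder and the `sensor1`/`sensor2` file names are placeholders created on the fly so the sketch is self-contained, and it assumes the CSVs hold purely numeric data:

```python
import os
import tempfile
import numpy as np
import h5py
from glob import iglob

# Demo folder with two small numeric CSV files, standing in for a
# folder like /root/Desktop/mal/ex1/ from the question.
folder = tempfile.mkdtemp()
for name in ("sensor1", "sensor2"):
    np.savetxt(os.path.join(folder, name + ".csv"),
               np.arange(6.0).reshape(3, 2), delimiter=",")

h5_path = os.path.join(folder, "combined.h5")
with h5py.File(h5_path, "w") as h5f:
    # iglob returns an iterator over matching paths.
    for csv_path in iglob(os.path.join(folder, "*.csv")):
        # Read each CSV as a NumPy array (numeric data, no header assumed).
        data = np.genfromtxt(csv_path, delimiter=",")
        # Name the dataset after the file, minus the .csv extension.
        dset_name = os.path.splitext(os.path.basename(csv_path))[0]
        h5f.create_dataset(dset_name, data=data)

with h5py.File(h5_path, "r") as h5f:
    print(sorted(h5f.keys()))  # ['sensor1', 'sensor2']
```

If the CSVs have header rows or mixed column types, `np.genfromtxt()` takes `skip_header=` and `dtype=` arguments to handle that.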
-
I know this is an old question, but check the accepted answer here. I believe it's doing what you expect: https://stackoverflow.com/questions/61018053/iteratively-append-pandas-dataframes-in-a-single-group-to-h5-file/61019264#61019264 – underclosed Jun 01 '22 at 16:36
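A pandas-based variant of the same idea (one `HDFStore` key per CSV file, as in the linked answer) could look like the sketch below. The demo folder and the `csvfile1`/`csvfile2` names are fabricated here for illustration, and `pd.HDFStore` requires the PyTables package to be installed:

```python
import os
import tempfile
import pandas as pd

# Demo folder with two CSV files, standing in for the ~16 files
# mentioned in the question.
folder = tempfile.mkdtemp()
for name in ("csvfile1", "csvfile2"):
    pd.DataFrame({"a": [1, 2], "b": [3, 4]}).to_csv(
        os.path.join(folder, name + ".csv"), index=False)

h5_path = os.path.join(folder, "combined.h5")
with pd.HDFStore(h5_path, mode="w") as store:
    for fname in sorted(os.listdir(folder)):
        if fname.endswith(".csv"):
            # Use the file name (without extension) as the store key,
            # so each CSV becomes its own group in the HDF5 file.
            key = os.path.splitext(fname)[0]
            store.put(key, pd.read_csv(os.path.join(folder, fname)))

with pd.HDFStore(h5_path, mode="r") as store:
    print(sorted(store.keys()))  # ['/csvfile1', '/csvfile2']
```

Unlike the h5py version, this keeps column names and mixed dtypes intact, since each dataset round-trips as a DataFrame.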