I have an external table in Hive that reads all the files from an HDFS location (/user/hive/warehouse/tableX).
Now, let's assume the data is pre-partitioned and all the previous files are split across several directories with a specific naming convention, <dir_name>_<incNumber>, e.g.
/user/hive/warehouse/split/
./dir_1/files...
./dir_2/files...
./dir_n/files...
How can I create another external table that keeps track of all the files in the split folder?
Do I need to create an external table that is partitioned on each sub-folder (dir_x)?
And if so, is some kind of Hive or shell script needed to create/add a partition for each sub-directory?
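For illustration, this is roughly what I have in mind — a shell sketch that would emit one ADD PARTITION statement per sub-directory (the table name `split_table` and partition column `dir_part` are placeholders I made up, and the fixed list of directories stands in for a real `hdfs dfs -ls` listing):

```shell
#!/bin/sh
# Hypothetical table the statements below would target:
#
#   CREATE EXTERNAL TABLE split_table ( ... )
#   PARTITIONED BY (dir_part STRING)
#   LOCATION '/user/hive/warehouse/split';
#
BASE=/user/hive/warehouse/split

# In practice the sub-directories would come from something like:
#   hdfs dfs -ls "$BASE" | awk '{print $NF}'
# Here they are hard-coded to keep the sketch self-contained.
stmts=""
for d in dir_1 dir_2 dir_3; do
  stmts="${stmts}ALTER TABLE split_table ADD PARTITION (dir_part='${d}') LOCATION '${BASE}/${d}';
"
done

# The generated statements could then be piped into the Hive CLI / Beeline.
printf '%s' "$stmts"
```

I'm not sure whether generating ALTER TABLE statements like this is the idiomatic approach, or whether there is a built-in way to pick up all the sub-directories at once.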