I have a folder on my desktop with 3,000+ small .txt files in it. How can I get that folder into Hadoop with all of the .txt files instead of uploading each one separately?
I am using Ubuntu OS with Hadoop 3.1.2
HDFS handles large numbers of small files poorly: every file consumes NameNode memory and occupies at least one block, so 3,000+ tiny .txt files waste resources. A better approach is to pack them into a single bzip2-compressed file and upload that to HDFS instead.
bzip2 is a splittable compression codec, so most Hadoop input formats and libraries can read and parallelize over the compressed file directly.
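A rough sketch of that workflow is below. The paths (`~/Desktop/txt_files`, `/data/txt`) are placeholders for your own folder and HDFS target, and it assumes the Hadoop client is on your PATH:

```bash
# Assumption: source folder is ~/Desktop/txt_files, HDFS target is /data/txt.
# Concatenate all the small .txt files into one bzip2-compressed file.
# (Assumes each file ends with a newline so records don't run together.)
cat ~/Desktop/txt_files/*.txt | bzip2 -9 > ~/Desktop/txt_files.bz2

# Create the target directory in HDFS and upload the single compressed file.
hdfs dfs -mkdir -p /data/txt
hdfs dfs -put ~/Desktop/txt_files.bz2 /data/txt/
```

For what it's worth, a plain `hdfs dfs -put ~/Desktop/txt_files /data/` would also copy the whole folder in one command, but it leaves you with 3,000+ separate HDFS files, which is exactly the small-files problem described above.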