I have a use case where I need to convert files stored on HDFS to CSV or TSV. One approach I know is to create a Hive table on top of the HDFS files and then store the data in the required format, but that requires knowing the data's structure (for example, the column names) in order to create the table. Is there another way, using Hive or something else, to convert files on HDFS to CSV or TSV?
What is your initial file format? Why don't you `cat` your file to see the column names? Without knowing the details of your file, changing its format is not a good idea. – arctic_Oak Jan 04 '19 at 07:10
"HDFS file format" isn't a thing... – OneCricketeer Jan 05 '19 at 20:33
1 Answer
First of all, to convert data to CSV or TSV, the data needs to be structured; please check that first.
The Hive-based approach you described is one option.
Another option is Spark: read the data in its structured format, then convert it to CSV when saving. See the following for more detail:
How to export data from Spark SQL to CSV
You can also use plain Python to convert the data to CSV or TSV.
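For the Python route, a minimal stdlib-only sketch, assuming the file has first been pulled off HDFS (e.g. with `hdfs dfs -get`) and is in a self-describing text format such as JSON Lines; the function name and file paths are hypothetical:

```python
import csv
import json

def jsonl_to_tsv(src, dst, delimiter="\t"):
    """Convert a JSON Lines file (one JSON object per line) to TSV/CSV.

    Column names are taken from the keys of the first record, so the
    input must already be structured -- this will not work on an
    opaque binary file.
    """
    with open(src) as fin:
        records = [json.loads(line) for line in fin if line.strip()]
    if not records:
        return  # nothing to write for an empty input
    fieldnames = list(records[0])
    with open(dst, "w", newline="") as fout:
        writer = csv.DictWriter(fout, fieldnames=fieldnames,
                                delimiter=delimiter)
        writer.writeheader()
        writer.writerows(records)
```

Pass `delimiter=","` for CSV instead of TSV. For binary formats like Parquet or ORC, the Spark route above is the more natural fit, since plain Python cannot read them without extra libraries.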
