2

I am trying to export the data from a Hive table as shown below, but it generates a file with a tab as the delimiter. Is it possible to specify the delimiter as a comma or tab when exporting?

hive -e "
use default;
set hive.cli.print.header=true;
select * from test1;
    " > /temp/test.txt (or .csv) 

Thanks!

sam
    Possible duplicate of [exporting Hive table to csv in hdfs](https://stackoverflow.com/questions/30224875/exporting-hive-table-to-csv-in-hdfs) – alejandrogiron Jun 02 '17 at 16:33

2 Answers

2

You can pipe your data through a simple sed command to convert it into the required output format. The sed command below replaces each tab with a comma; Hive's default output delimiter is a tab. You can change the replacement character accordingly.

hive -e "use default;set hive.cli.print.header=true;select * from test1;" | sed 's/[\t]/,/g' >/temp/test.csv
Prabhat Ratnala
  • Thank you Prabhat. I did try the sed command; what I see is it is putting a comma at the end of each field, like ABC ,123 , 2016 ,... Instead of a comma, if I want to use a pipe as the delimiter, what can I use? – sam Jun 06 '17 at 03:54
  • hive -e "use default;set hive.cli.print.header=true;select * from test1;" | sed 's/[\t]/|/g' >/temp/test.csv This would replace the tabs with pipes – Prabhat Ratnala Jun 06 '17 at 19:06
0

As an alternative to @Prabhat Ratnala's solution, which is fine, you can use this:

INSERT OVERWRITE LOCAL DIRECTORY '/temp/your_dir' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' SELECT * FROM test1;
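If you want to run it non-interactively the same way as in the question, here is a minimal sketch of invoking that statement with hive -e (the directory '/temp/your_dir' is just a placeholder, as above):

hive -e "use default; INSERT OVERWRITE LOCAL DIRECTORY '/temp/your_dir' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' SELECT * FROM test1;"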

leftjoin
  • Thank you. I tried this before; using INSERT OVERWRITE, it invokes a MapReduce job and creates multiple splits of the file – sam Jun 06 '17 at 03:56
  • Sure, and then you use cat to merge them all into one file. – Andrew Jun 14 '17 at 19:57
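For completeness, a minimal sketch of merging those split files into a single CSV, assuming the output directory '/temp/your_dir' from the answer above and the part files Hive typically writes there (e.g. 000000_0):

cat /temp/your_dir/* > /temp/test.csv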