26

I have the default database in Hive, which contains 80 tables.
I have created one more database and I want to copy all the tables from the default DB to the new database.

Is there any way I can copy the tables from one DB to the other without creating each table individually?

Please let me know if there is any solution. Thanks in advance.


4 Answers

41

I can think of a couple of options.

  1. Use CTAS.

    CREATE TABLE NEWDB.NEW_TABLE1 AS select * from OLDDB.OLD_TABLE1;
    CREATE TABLE NEWDB.NEW_TABLE2 AS select * from OLDDB.OLD_TABLE2;
    ...
    
  2. Use the EXPORT/IMPORT feature of Hive: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+ImportExport (a minimal sketch follows below).
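
For example, a minimal EXPORT/IMPORT sketch for a single table (the staging path and the OLDDB/NEWDB/table names below are placeholders, not from the original answer):

    -- export one table's data and metadata to a staging path on HDFS
    USE OLDDB;
    EXPORT TABLE OLD_TABLE1 TO '/tmp/hive_export/OLD_TABLE1';

    -- import it into the new database
    USE NEWDB;
    IMPORT TABLE NEW_TABLE1 FROM '/tmp/hive_export/OLD_TABLE1';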

Hope this helps.

  • Note that `show tables` will give the 80 names that are needed, and that the 80 lines can then easily be created in a tool like excel. – Dennis Jaheruddin Jan 27 '16 at 15:48
  • No. 1 worked great for me, except the partitions don't copy over. Any way to achieve that? – vk1011 Jan 16 '17 at 02:59
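
Following up on the comments above, the per-table CTAS statements can also be generated with a shell loop instead of Excel. A minimal sketch, assuming the classic hive CLI is on the PATH and that OLDDB/NEWDB are placeholders for your database names:

hive -S -e 'SHOW TABLES IN OLDDB;' | while read -r t; do
  echo "CREATE TABLE NEWDB.${t} AS SELECT * FROM OLDDB.${t};"
done > copy_tables.hql

hive -f copy_tables.hql

Review copy_tables.hql before running it; as noted in the comment above, CTAS does not carry partitions over.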
4

create external table new_db.table like old_db.table location '<hdfs path of the existing table data>';

If the table is partitioned, you also have to add the partitions to new_db.table (see the sketch below).
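
A minimal sketch of registering those partitions, assuming a hypothetical table new_db.events whose data directories follow the usual dt=... layout (the table and partition names are placeholders, not from the original answer):

-- let Hive discover all partition directories under the table location
MSCK REPAIR TABLE new_db.events;

-- or register a single partition explicitly
ALTER TABLE new_db.events ADD PARTITION (dt='2017-01-16');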

4

These are probably the fastest and simplest ways to copy or move tables from one DB to another.

To move a table

Since Hive 0.14, you can use the following statement to move a table from one database to another within the same metastore:

alter table old_database.table_a rename to new_database.table_a;

The above statement will also move the table data on HDFS if table_a is a managed table.

To copy a table

You can always use CREATE TABLE <new_db>.<new_table> AS SELECT * FROM <old_db>.<old_table>; statements. But I believe this alternative method of copying the database directory with hdfs dfs -cp and then creating the tables with LIKE can be a little faster if your tables are huge:

hdfs dfs -cp /user/hive/warehouse/<old_database>.db /user/hive/warehouse/<new_database>.db

And then in Hive:

CREATE DATABASE <new_database>;
CREATE TABLE <new_database>.<new_table> LIKE <old_database>.<old_table>;
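
One caveat, not spelled out above: CREATE TABLE ... LIKE only copies the table definition, so if a copied table is partitioned, its partition metadata still has to be registered after the data is in place. A minimal sketch, reusing the placeholder names from the statements above:

MSCK REPAIR TABLE <new_database>.<new_table>;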
1

You can take the following approach, using Hive's EXPORT/IMPORT feature.

The syntax looks something like this:

EXPORT TABLE table_or_partition TO hdfs_path;
IMPORT [[EXTERNAL] TABLE table_or_partition] FROM hdfs_path [LOCATION table_location];

Some sample statements would look like:

EXPORT TABLE <table_name> TO 'location in hdfs';

USE test_db;
IMPORT FROM 'location in hdfs';

Export/import can be applied on a partition basis as well:

EXPORT TABLE <table_name> PARTITION (loc="USA") TO 'location in hdfs';

The import command below imports to an external table instead of a managed one:

IMPORT EXTERNAL TABLE FROM 'location in hdfs' LOCATION '/location/of/external/table';
