I need to move a hive table from one database to another. How can I do that?
8 Answers
Since Hive 0.14, you can use the following statement to move a table from one database to another in the same metastore:
use old_database;
alter table table_a rename to new_database.table_a;
The above statement will also move the table data on HDFS if table_a is a managed table.
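As a sketch (database and table names assumed), the rename plus a quick check that the data actually moved:

```sql
-- Move table_a into new_database (same metastore, Hive 0.14+).
USE old_database;
ALTER TABLE table_a RENAME TO new_database.table_a;

-- For a managed table, the Location line in the output below should now
-- point under the new database's warehouse directory on HDFS.
DESCRIBE FORMATTED new_database.table_a;
```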

- Fastest and simplest solution. Wondering why it didn't click to me and so many others. :| Thanks. Should be the accepted answer. – pratpor Feb 23 '17 at 14:02
- This is the right answer, provided you are using Hive metastore 0.14+. – Myles Baker Sep 01 '17 at 18:28
- @user2942227 - Why is this not accepted as the answer yet? – Ramesh Jan 12 '18 at 14:25
- I'm using hive 2.1.1, the table is managed, and the files / table data are not moving with the rename. Is there anything else I need to do so the files move with it? – Topde Aug 23 '21 at 13:20
create external table new_db.table like old_db.table location '(path of the table data in hdfs)';
If the table is partitioned, you also have to add the partitions to new_db.table.
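A sketch of adding the partitions after the CREATE ... LIKE above (names and paths assumed): MSCK REPAIR picks up every partition directory already present under the table location, or individual partitions can be added by hand.

```sql
-- Option 1: discover all partition directories under the table location.
MSCK REPAIR TABLE new_db.table;

-- Option 2: register specific partitions one at a time.
ALTER TABLE new_db.table ADD PARTITION (loc='USA')
  LOCATION '(path of the table data in hdfs)/loc=USA';
```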

You can try CTAS (CREATE TABLE AS SELECT):
USE NEW_DB;
CREATE TABLE table
AS
SELECT * FROM OLD_DB.table;
DROP TABLE OLD_DB.table;
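Note that CTAS creates the target table with the session's default storage format rather than copying the source table's format; as a sketch (table names assumed), the format can be stated explicitly:

```sql
USE new_db;
CREATE TABLE my_table
  STORED AS ORC           -- stated explicitly, since CTAS does not copy it
AS SELECT * FROM old_db.my_table;
DROP TABLE old_db.my_table;
```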

- Concise and simple, I like it! Note that CTAS has these restrictions: 1) The target table cannot be an external table. 2) The target table cannot be a list bucketing table. – KeyMaker00 Jun 04 '20 at 13:29
This might be helpful to you.
EXPORT TABLE table_or_partition TO hdfs_path;
IMPORT [[EXTERNAL] TABLE table_or_partition] FROM hdfs_path [LOCATION[table_location]];
Some sample statements would look like:
EXPORT TABLE <table name> TO 'location in hdfs';
USE test_db;
IMPORT FROM 'location in hdfs';
Export/Import can be applied on a per-partition basis as well:
EXPORT TABLE <table name> PARTITION (loc="USA") to 'location in hdfs';
The below import command imports to an external table instead of a managed one:
IMPORT EXTERNAL TABLE FROM 'location in hdfs' LOCATION '/location/of/external/table';
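Putting the pieces together, a sketch of a full move via Export/Import (staging path and table names assumed):

```sql
-- Snapshot the table (data + metadata) to a staging directory on HDFS.
EXPORT TABLE old_db.my_table TO '/tmp/hive_export/my_table';

-- Recreate it in the target database from that snapshot.
USE new_db;
IMPORT FROM '/tmp/hive_export/my_table';

-- Once verified, drop the original.
DROP TABLE old_db.my_table;
```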

If the tables are partitioned, then basically you can copy the partitioned data from the old table to a new table and delete the old one.
use new_db;
Create the new table in the new database, with the same layout as the old one:
create table new_table_name like old_db.old_table_name;
Insert data into the new table from the old table using the command:
insert into new_table_name partition (partition_column='value') select col1, col2, col3, col4 from old_db.old_table_name where partition_column='value';
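Copying one partition value at a time gets tedious when there are many partitions; as a sketch (names assumed), dynamic partitioning can copy them all in one statement:

```sql
-- Allow inserts that derive every partition value from the data itself.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- The partition column goes last in the SELECT list.
INSERT INTO TABLE new_db.new_table_name PARTITION (partition_column)
SELECT col1, col2, col3, col4, partition_column
FROM old_db.old_table_name;
```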
The database you want to migrate to would have provided you with a database connector. With Sqoop and that connector, you should be able to migrate the data. It would be easier to help if you were more specific about the type of database you want to migrate the data to.

https://issues.apache.org/jira/browse/HIVE-2496
That improvement is still open. I know it's possible in Impala, but not currently in Hive.
Development seems stalled; you can vote the issue up on that page to bring it some attention.

- HIVE-2496 was fixed by HIVE-9180, although nobody has closed HIVE-2496 yet. As Jay has mentioned, it's now possible (since 0.14) to rename a Hive table between databases. – Tagar Jul 20 '15 at 22:23
There are many methods given in the answers, but no guidance on when to use which:
Use CTAS when your target is a managed table:
CREATE TABLE new_db.table_a AS SELECT * FROM old_db.table_a; DROP TABLE old_db.table_a;
If you just want to do it with a simple table name change, but the location of the table doesn't change:
alter table old_database.table_a rename to new_database.table_a;
Using CREATE LIKE and INSERT, which keeps the old table's layout (including partitioning):
CREATE TABLE new_db.new_table_name LIKE old_db.old_table_name; INSERT INTO TABLE new_db.new_table_name PARTITION (partition_column='value') SELECT col1, col2, col3, col4 FROM old_db.old_table_name WHERE partition_column='value'; DROP TABLE old_db.old_table_name;
