21

I have an Oracle database with 10 tables. Some of the tables contain CLOB text data. I need to export the data from these tables programmatically, using Java. The exported data should be in ANSI INSERT INTO SQL format, for example:

INSERT INTO table_name (column1, column2, column3, ...)
VALUES (value1, value2, value3, ...);

The main idea is that I need to import this data into three different databases: Oracle, MSSQL and MySQL. As far as I know, all these databases support ANSI INSERT INTO. But I have not found any Java API/framework for generating SQL data scripts, and I do not know how to deal with CLOB data or how to export it.
What is the best way to export data from a database with Java?

UPDATE: (01.07.2018)
I guess it is impossible to insert text data longer than 4000 bytes, according to this answer. How can I generate PL/SQL scripts with Java programmatically? Or is there any other export format that supports Oracle, MSSQL, etc.?

Nazim Kerimbekov
Maksym
  • 1
    It might be possible to do the entire thing within Java, e.g. open connections to both Oracle (the source) and SQL Server (the destination), and iterate a table from Oracle, writing each record to SQL Server. – Tim Biegeleisen Jan 05 '18 at 11:29
  • The Connection has database metadata you can query for TABLEs. – Joop Eggen Jan 05 '18 at 11:31
  • 2
    Do you absolutely need a set of ANSI SQL files, or can you afford writing to a delimited text file and then loading from it? It would be much easier and faster to dump to text and then import, instead of executing SQL for loading. – Serg M Ten Jan 09 '18 at 11:45
  • 1
    No, I do not absolutely need an ANSI SQL file (but it would be perfect). For now I have decided to export all data into XML files and write a little Java tool with Hibernate that will insert all the data from these files into the DB. Then I can change the driver and apply it to different SQL vendors. – Maksym Jan 09 '18 at 12:04
  • 1
    I think you are much better off with a DB-to-file-to-DB pipeline than with SQL scripts, even at the cost of a custom dump/load, as you will hardly get an easy one-size-fits-all set of SQL scripts. – Serg M Ten Jan 09 '18 at 13:14
  • 1
    Best practice would be to export the CLOB as an external file and load it through the file. I would suggest you read http://www.dba-oracle.com/t_insert_clob_table_column.htm – logger Jan 09 '18 at 18:39

7 Answers

6

Did you ever think about a proper ORM API? The first thing that comes to mind is Hibernate, or more abstractly JPA/JPQL. The framework knows all the main SQL dialects. All you need is to define your connections with their dialects; you retrieve the data from the source database, it is mapped into POJOs, and then you push (insert) the data through the other connection with its different dialect. This should work, I think, even though I have never done it myself. JPA is not new and is widely used precisely so that the database can be changed even when the software is already in production. This approach is a bit slow, since every row gets transformed into a POJO and, as far as I know, no bulk insertion is available.
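
A minimal sketch of that approach, assuming plain JPA with two persistence units; the unit names (oracle-src, target-db) and the Document entity are hypothetical:

import java.util.List;
import javax.persistence.*;

// Hypothetical entity for one of the ten tables; @Lob maps the CLOB to a String.
@Entity
class Document {
    @Id Long id;
    @Lob String text;
}

public class JpaCopy {
    public static void main(String[] args) {
        // Two persistence units configured in persistence.xml with different
        // JDBC URLs and dialects (both names are made up for this sketch).
        EntityManagerFactory src = Persistence.createEntityManagerFactory("oracle-src");
        EntityManagerFactory dst = Persistence.createEntityManagerFactory("target-db");

        EntityManager in = src.createEntityManager();
        EntityManager out = dst.createEntityManager();
        out.getTransaction().begin();

        // Rows come back as POJOs; dialect differences stay inside JPA.
        List<Document> docs = in.createQuery("SELECT d FROM Document d", Document.class)
                                .getResultList();
        for (Document d : docs) {
            out.merge(d); // merge() keeps the ids that were exported
        }

        out.getTransaction().commit();
        in.close();
        out.close();
    }
}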

Henning Luther
4

If you are looking for SQL generation, there are many SQL builder libraries available that you can use.

You can use the result set metadata to get the column names and types from the SELECT * query and use them to build the INSERT statements, as sketched below.

See https://github.com/jkrasnay/sqlbuilder

More about it at http://john.krasnay.ca/2010/02/15/building-sql-in-java.html
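
A rough sketch of that idea with plain JDBC; the quoting here is minimal (single quotes doubled), and dates and CLOBs would still need vendor-aware handling:

import java.sql.*;

public class InsertScriptWriter {

    // Emits one INSERT INTO statement per row, using ResultSetMetaData
    // for the column list.
    public static void dumpTable(Connection con, String table) throws SQLException {
        try (Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM " + table)) {
            ResultSetMetaData md = rs.getMetaData();
            int cols = md.getColumnCount();

            StringBuilder head = new StringBuilder("INSERT INTO " + table + " (");
            for (int i = 1; i <= cols; i++) {
                head.append(md.getColumnName(i)).append(i < cols ? ", " : ") VALUES (");
            }

            while (rs.next()) {
                StringBuilder row = new StringBuilder(head);
                for (int i = 1; i <= cols; i++) {
                    Object v = rs.getObject(i);
                    if (v == null) {
                        row.append("NULL");
                    } else if (v instanceof Number) {
                        row.append(v);
                    } else {
                        // Quote everything else; double embedded single quotes.
                        row.append('\'').append(v.toString().replace("'", "''")).append('\'');
                    }
                    row.append(i < cols ? ", " : ");");
                }
                System.out.println(row);
            }
        }
    }
}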

ArunPrakash
4

If your need is to export tables from an Oracle database and insert them back into different types of databases, I would suggest a different approach.

This is the perfect use case for JPA (Java Persistence API), which allows you to create a model that represents your database structure. It is the current Java solution for managing different types of databases.

From your model you will be able to generate queries compatible with all popular databases.

So my suggestion is to use Spring Boot + Spring Data + Spring Batch:

  1. Create a first app from your model that exports the content of your tables to CSV format.
  2. Create a second app from the same model that imports your CSV files. Depending on your JDBC URL, Spring Boot will automatically pick the appropriate dialect for your target database and generate the right queries (this is also the case for the export).

This can be done within a reasonable amount of time and with decent performance; a sketch of the export step follows below.
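
A minimal sketch of the reader and writer for step 1, assuming Spring Batch's builder API; the table name, file name and the job/step wiring around them are made up for illustration:

import java.util.Map;
import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.builder.FlatFileItemWriterBuilder;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.core.ColumnMapRowMapper;

public class ExportStepConfig {

    // Streams rows from the source table instead of loading them all into memory.
    public JdbcCursorItemReader<Map<String, Object>> reader(DataSource ds) {
        return new JdbcCursorItemReaderBuilder<Map<String, Object>>()
                .name("documentsReader")
                .dataSource(ds)
                .sql("SELECT * FROM documents") // hypothetical table
                .rowMapper(new ColumnMapRowMapper())
                .build();
    }

    // Writes each row as one delimited line; CLOB columns arrive as strings.
    public FlatFileItemWriter<Map<String, Object>> writer() {
        return new FlatFileItemWriterBuilder<Map<String, Object>>()
                .name("documentsWriter")
                .resource(new FileSystemResource("documents.csv"))
                .delimited()
                .delimiter(";")
                .fieldExtractor(row -> row.values().toArray())
                .build();
    }
}

The import app is the mirror image: a FlatFileItemReader feeding a JdbcBatchItemWriter, with the JDBC URL deciding the dialect.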

teikitel
2

I need to export data from these tables programmatically using java

Come on, guys! What is the matter? Java is a tool to operate on data, not to migrate it. If it's about ETL, please use the ETL environment of the target DBMS or write the ETL code on your own.

Yuriy Tsarkov
2

For MSSQL and Oracle you can use the MERGE syntax (with a USING clause for the data), which follows the ANSI standard:

MERGE INTO tablename USING table_reference ON (condition)
   WHEN MATCHED THEN
   UPDATE SET column1 = value1 [, column2 = value2 ...]
   WHEN NOT MATCHED THEN
   INSERT (column1 [, column2 ...]) VALUES (value1 [, value2 ...]);

For MySQL there is a different syntax (the ON DUPLICATE KEY UPDATE statement). First, an Oracle MERGE example:

-- Insert new or merge into existing row.
MERGE INTO system_user target
USING (SELECT   1 AS system_user_id
       ,       'SYSADMIN' AS system_user_name
       ,        1 AS system_user_group_id
       ,        1 AS system_user_type
       ,       'Samuel' AS first_name
       ,        'the' AS middle_name
       ,       'Lamanite' AS last_name
       ,        1 AS created_by
       ,        SYSDATE AS creation_date
       ,        1 AS last_updated_by
       ,        SYSDATE AS last_update_date
       FROM     dual) SOURCE
ON (target.system_user_id = SOURCE.system_user_id)
WHEN MATCHED THEN
  UPDATE SET first_name = 'Samuel'
  ,          middle_name = 'the'
  ,          last_name = 'Lamanite'
  ,          last_updated_by = 1
  ,          last_update_date = SYSDATE
WHEN NOT MATCHED THEN
  INSERT
  ( target.system_user_id
  , target.system_user_name
  , target.system_user_group_id
  , target.system_user_type
  , target.first_name
  , target.middle_name
  , target.last_name
  , target.created_by
  , target.creation_date
  , target.last_updated_by
  , target.last_update_date )  
  VALUES
  ( SOURCE.system_user_id
  , SOURCE.system_user_name
  , SOURCE.system_user_group_id
  , SOURCE.system_user_type
  , SOURCE.first_name
  , SOURCE.middle_name
  , SOURCE.last_name
  , SOURCE.created_by
  , SOURCE.creation_date
  , SOURCE.last_updated_by
  , SOURCE.last_update_date );

And the MySQL equivalent:

-- Insert new or merge into existing row.
INSERT INTO system_user
( system_user_name
, system_user_group_id
, system_user_type
, first_name
, middle_name
, last_name
, created_by
, creation_date
, last_updated_by
, last_update_date )
VALUES
('SYSADMIN'
, 1
, 1
,'Samuel'
,'the'
,'Lamanite'
, 1
, NOW()
, 1
, NOW())
ON DUPLICATE KEY 
UPDATE first_name = 'Samuel'
,      middle_name = 'the'
,      last_name = 'Lamanite'
,      last_updated_by = 1
,      last_update_date = UTC_DATE();
2

I would try Scriptella. It is an open source ETL and script execution tool written in Java, in which you define the source and target connections, and transformations if needed, in an XML file. Connections can be JDBC or even text files, and there is support for batching. The resulting XML file can be processed programmatically from Java, Ant or the command line.

In their two minute tutorial there are examples for copying tables to another database and for working with BLOBs.
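
Running such a file from Java is then a one-liner, assuming Scriptella's EtlExecutor API (the file name etl.xml is hypothetical):

import java.io.File;
import scriptella.execution.EtlExecutor;

public class RunEtl {
    public static void main(String[] args) throws Exception {
        // etl.xml holds the source/target connections and the copy queries.
        EtlExecutor.newExecutor(new File("etl.xml").toURI().toURL()).execute();
    }
}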

Sal
2

I like "pro-grammatically" very much. :)

The best way to export that data is to iterate over the tables, query each of them, and output plain text with INSERT INTO statements. Binary data can be a problem, since different RDBMSs deal with it in slightly different ways.

Reading a BLOB/CLOB on the Java side means reading a stream; it can be a binary or a character stream. For Oracle, following the docs, you can do:

        ResultSet rs = s.executeQuery(
            "SELECT text FROM documents WHERE id = 1477");
        while (rs.next()) {
            // Read the CLOB column as an ASCII stream, character by character.
            java.io.InputStream ip = rs.getAsciiStream(1);
            int c = ip.read();
            while (c != -1) { // read() returns -1 at end of stream
                System.out.print((char) c);
                c = ip.read();
            }
            System.out.print("\n");
        }

From this answer, you can make it shorter:

Clob clob = resultSet.getClob("CLOB_COLUMN");
String clobContent = clob.getSubString(1, (int) clob.length());

Writing the output will probably require dealing with \t, \n and \r; it depends on your needs and content. The docs have a full example of reading and writing. They use a prepared statement, and that is why streams are needed at both ends. If your CLOB is not big (say, 32k/64k) there can be other limits. If you provide an example, such as a CREATE TABLE with 2-3 rows, it would be much easier for anyone to write code and provide something that works...
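
Putting those pieces together, a sketch of dumping one table with a CLOB column into INSERT statements; the connection string and table are hypothetical, only single quotes are escaped, and very long values will still hit the 4000-byte literal limit mentioned in the question's update:

import java.sql.*;

public class ClobDump {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//localhost:1521/XE", "user", "pass");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, text FROM documents")) {
            while (rs.next()) {
                Clob clob = rs.getClob("text");
                String content = clob.getSubString(1, (int) clob.length());
                String escaped = content.replace("'", "''"); // double embedded quotes
                System.out.printf("INSERT INTO documents (id, text) VALUES (%d, '%s');%n",
                        rs.getLong("id"), escaped);
            }
        }
    }
}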

Michał Zaborowski