I have a scenario to migrate SQL Server tables (30-40 tables) to Oracle. I cannot depend on SSIS, because the number of tables to be migrated to Oracle changes regularly and I cannot create or update a DFT every time the schema changes. Is there any other way the data movement can be handled dynamically and efficiently, for example using Python or another programming language?
-
One solution is to link the servers and use database commands in SQL Server to push the data to Oracle. – Gordon Linoff Mar 11 '19 at 15:09
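For illustration, a minimal sketch of this linked-server idea in C#; the linked server name ORACLE_LINK, the table names and the connection string are placeholders, and the same INSERT could just as well run from a SQL Agent job:

//Minimal sketch only: ORACLE_LINK, HR.EMPLOYEES, dbo.Employees and the connection string are placeholders
using System.Data.SqlClient;

class LinkedServerPush
{
    static void Main()
    {
        //Four-part name syntax pushes rows from SQL Server to the linked Oracle server
        string sql = @"INSERT INTO ORACLE_LINK..HR.EMPLOYEES (ID, NAME)
                       SELECT Id, Name FROM dbo.Employees;";

        using (var conn = new SqlConnection("Data Source=SQLSERVER;Initial Catalog=SourceDb;Integrated Security=True"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0; //large pushes can run for a long time
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}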
-
@GordonLinoff I have tried this, but moving 40 GB of data takes a lot of time and we have a time constraint, as this process has to run daily. – Srikar mogaliraju Mar 11 '19 at 15:11
-
Have you looked into replication? https://learn.microsoft.com/en-us/sql/relational-databases/replication/sql-server-replication?view=sql-server-2017 – Sean Lange Mar 11 '19 at 15:12
-
@Hadi As I was busy with other initiatives I wasn't able to try your solution. I will definitely try it and get back to you soon. – Srikar mogaliraju Mar 11 '19 at 17:36
-
@Srikarmogaliraju you are not migrating. A migration is a one-time data extraction; in your case you are building a dedicated ETL solution for the long term, so calling it a migration does not make sense. – Arthur Mar 11 '19 at 18:02
-
@Arthur I know that I need to build an ETL solution for this. I was looking for ideas that would make the task more dynamic and effective. I cannot create a Data Flow Task for each and every table, as the maintenance would be hectic given the schema changes that happen frequently in our case. – Srikar mogaliraju Mar 11 '19 at 18:10
-
OK @Srikarmogaliraju, so now that we know you are building a permanent data mover: building an ETL that responds transparently to changes in both data sources is a huge undertaking, and SSIS unfortunately will not do that without major effort. Try to convince whoever dreamed this up not to do it with SSIS, but perhaps with Alooma, or in the worst case use [BIML](https://www.mssqltips.com/sqlservertip/3094/introduction-to-business-intelligence-markup-language-biml-for-ssis/) to re-generate the package dynamically in code (programmatically) before each run. – Arthur Mar 11 '19 at 18:19
-
Thanks for the suggestions @Arthur. I already knew about BIML; I will look into Alooma. – Srikar mogaliraju Mar 11 '19 at 18:27
-
@Srikarmogaliraju take a look [here](https://stackoverflow.com/questions/55029854/how-to-manage-ssis-script-component-output-columns-and-its-properties-programmat/55031970#55031970) it may give you some insights – Hadi Mar 11 '19 at 19:08
-
If space is not an issue, you might fare well by restoring the SQL Server database to Oracle and then copying the tables you need into another database. Maybe you can automate that to run on a nightly basis using Perl or other Oracle scripts. Look at this link https://www.oracle.com/webfolder/technetwork/tutorials/obe/db/sqldev/r30/SQLServerMigration/SQLServerMigration.htm – J Sidhu Mar 11 '19 at 19:12
-
@Arthur I was going through Alooma and I see that Oracle DB is not supported as a destination. Can you provide some insight on this? – Srikar mogaliraju Mar 12 '19 at 14:54
-
@Srikarmogaliraju sorry, my bad; Oracle does indeed appear to be unsupported as a destination. – Arthur Mar 12 '19 at 20:54
2 Answers
C# approach - SchemaMapper library
Since you are open to a solution using a programming language, I think you can benefit from the SchemaMapper class library, an open-source project published on GitHub. A full description can be found in the Readme file at the link above.
Important Note: Yesterday I added support for reading data from databases (SQL Server, Oracle, ...) and the ability to export data to Oracle.
In this answer I will show how to import SQL Server tables, create the appropriate SchemaMapper class for each one (since they have different schemas and must be imported into different destination schemas), and export the data to Oracle.
//First of all, list the table names that need to be imported
string[] TableNameFilter = new[] { "Table1", "Table2" };

//Create an instance of the Oracle export class
SchemaMapper.Exporters.OracleExport expOracle = new SchemaMapper.Exporters.OracleExport(oracleconnectionstring);

//Create a SQL Server import class
using (SchemaMapper.Converters.SqlServerCeImport ssImport = new SchemaMapper.Converters.SqlServerCeImport(sqlconnectionstring))
{
    //Retrieve the table names
    ssImport.getSchemaTable();

    //Loop over the tables matching the filter
    foreach (DataRow drRowSchema in ssImport.SchemaTable.AsEnumerable().Where(x =>
             TableNameFilter.Contains(x["TABLE_NAME"].ToString())).ToList())
    {
        string SQLTableName = drRowSchema["TABLE_NAME"].ToString();
        string SQLTableSchema = drRowSchema["TABLE_SCHEMA"].ToString();
        DataTable dtSQL = ssImport.GetDataTable(SQLTableSchema, SQLTableName);

        //Create a schema mapping class
        using (SchemaMapper.SchemaMapping.SchemaMapper sm = new SchemaMapper.SchemaMapping.SchemaMapper(SQLTableSchema, SQLTableName))
        {
            //Map each SQL Server column to the corresponding Oracle data type
            foreach (DataColumn dc in dtSQL.Columns)
            {
                SchemaMapper_Column smCol = new SchemaMapper_Column();
                smCol.Name = dc.ColumnName;
                smCol.DataType = smCol.GetCorrespondingDataType(dc.DataType.ToString(), dc.MaxLength);
                sm.Columns.Add(smCol);
            }

            //Create the destination table in Oracle
            expOracle.CreateDestinationTable(sm);

            //Insert the data
            expOracle.InsertUsingOracleBulk(sm, dtSQL);

            //There are other methods such as:
            //expOracle.InsertIntoDb(sm, dtSQL);
            //expOracle.InsertIntoDbWithParameters(sm, dtSQL);
        }
    }
}
Note: this is an open-source project; it is not fully tested and not all data types are supported. If you encounter errors, feel free to give feedback or open an issue on GitHub.
Other approach - SQL Server Import and Export Wizard
If you can do this without scheduling a job, you can use the Import and Export Wizard, which allows you to import multiple tables into Oracle without building the packages manually. It creates the packages and destination tables, maps the columns, and imports the data.

-
Nice! But setting up the connection on the exporter the way you did on the importer would be better. – amd Mar 17 '19 at 12:33
-
@Hadi As the SQL Server and Oracle instances are on 2 different servers and transferring the data was taking more time than expected with the C# approach, I have decided to go with a different approach. Step 1: dump the SQL tables into flat files. Step 2: transfer the files to the destination server. Step 3: load the data from the files into Oracle using SQL*Loader. For one 8 GB table, the above 3 steps take 11 minutes. – Srikar mogaliraju Mar 17 '19 at 16:46
-
Here is the approach I decided to go with, considering the time constraint (the C# approach was taking more time). For an 8 GB table it takes 11 minutes to move the data from SQL Server to Oracle. A rough code sketch of steps 1 and 3 is shown after the list.
Steps:
- Dump the SQL table data into flat files. (Used BIML to automate the DFT creation.)
- Transfer these flat files to the destination server.
- Use SQL*Loader to load the data from the flat files into Oracle.
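A minimal C# sketch of steps 1 and 3, assuming placeholder connection strings, table name (dbo.MyTable), file paths and a pre-written SQL*Loader control file (mytable.ctl); step 2, the file transfer, depends on how the servers are connected and is omitted:

//Minimal sketch only: dbo.MyTable, the paths, credentials and mytable.ctl are placeholders
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;
using System.IO;

class FlatFilePipeline
{
    static void Main()
    {
        //Step 1: dump the table to a pipe-delimited flat file
        using (var conn = new SqlConnection("Data Source=SQLSERVER;Initial Catalog=SourceDb;Integrated Security=True"))
        using (var cmd = new SqlCommand("SELECT * FROM dbo.MyTable", conn))
        using (var writer = new StreamWriter(@"C:\dumps\mytable.dat"))
        {
            conn.Open();
            using (IDataReader reader = cmd.ExecuteReader())
            {
                var values = new object[reader.FieldCount];
                while (reader.Read())
                {
                    reader.GetValues(values);
                    writer.WriteLine(string.Join("|", values));
                }
            }
        }

        //Step 2: transfer mytable.dat to the destination server (robocopy, scp, ...) - omitted here

        //Step 3: on the destination server, invoke SQL*Loader with a control file
        //that describes the flat file layout
        var sqlldr = Process.Start("sqlldr", "userid=placeholder/placeholder control=mytable.ctl data=mytable.dat");
        sqlldr.WaitForExit();
    }
}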
