
I'm working with an SSIS package that imports from an Oracle table into a SQL Server table. Because of the type differences, I had to put a Data Conversion transformation in between.

With the current setup, the OLE DB Source retrieves the complete table, the rows pass through the Data Conversion transformation, and they are then sent to the OLE DB Destination.


The table I'm trying to import has around 7.3 million records with 53 columns.

I need to know how to set this up (or what changes to make to the current setup) to speed up this process as much as possible.

The package will run as a scheduled SQL Server Agent job.

The last run inserted 78k records in 15 minutes; at this pace the full table would take close to a day, which is far too slow.
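A rough back-of-the-envelope projection from those numbers:

```python
# Projected total load time at the observed rate:
# 78,000 rows in 15 minutes, ~7.3 million rows in total.
rows_total = 7_300_000
rows_loaded = 78_000
minutes_elapsed = 15

rows_per_minute = rows_loaded / minutes_elapsed      # 5,200 rows/min
projected_hours = rows_total / rows_per_minute / 60  # ~23.4 hours
print(f"{rows_per_minute:,.0f} rows/min -> ~{projected_hours:.1f} hours for the full table")
```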

I believe I have to tune the "Rows per batch" and "Maximum insert commit size" settings, but I haven't found information about which values should work, and the different values I've tried made no noticeable difference.

UPDATE: After some more testing, the delay is in getting the records from Oracle, not in inserting them into SQL Server. I need to find out how to improve that.
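To see how fast Oracle can actually serve the rows, one option is to time a raw fetch outside of SSIS. Below is a minimal sketch using the python-oracledb driver; the connection details and the table name are placeholders, and the batch sizes are only a starting point:

```python
import time

import oracledb  # pip install oracledb

# Placeholder credentials/DSN -- replace with the real Oracle host and service.
conn = oracledb.connect(user="scott", password="tiger", dsn="orahost/orclpdb")
cur = conn.cursor()

# Fetch rows in large batches instead of the driver default of 100,
# which cuts down the number of network round trips.
cur.arraysize = 10_000
cur.prefetchrows = 10_000

start = time.perf_counter()
cur.execute("SELECT * FROM my_big_table")  # placeholder table name
rows = 0
while True:
    batch = cur.fetchmany()
    if not batch:
        break
    rows += len(batch)
elapsed = time.perf_counter() - start
print(f"{rows:,} rows in {elapsed:.1f}s ({rows / elapsed:,.0f} rows/s)")
```

If even this plain fetch is slow with a large batch size, the bottleneck is the query or the network rather than anything in the SSIS data flow.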

Baldie47
  • How do you know that it's not Oracle slow serving data on the source side? How much data is in each row? What are the specs on the machine running SSIS? The size of the network pipe? There's a lot of variables involved here. – Jacob H Sep 26 '19 at 16:32
  • I'm working with the same, so I'll keep an eye on this. Some things I have observed on my project: 1) Depending on the time of day the package runs alongside other ETL jobs, the run time can vary greatly, i.e. +/- 50 mins for loading 6M+ rows from 11g to SQL 2016. 2) It depends on whether both servers are standalone or on VMware (a whole separate can of worms). 3) On the SQL side, if the destination table has a COLUMN STORE INDEX, inserts will be slow. – junketsu Sep 26 '19 at 16:33
  • @JacobH That's an interesting point; it would be useful to pinpoint whether the issue actually comes from Oracle being the slow side... – Baldie47 Sep 26 '19 at 19:04
  • I recommend you check the following settings [DefaultBufferMaxRows] and [DefaultBufferSize] – arce.est Sep 26 '19 at 19:57
  • After a bit more test, the delay is from getting records from Oracle, not to insert them into SQL server. I need to check on how can I improve this – Baldie47 Sep 26 '19 at 22:59
  • You might want to consider using one of the faster connectors. [See this technet article SSIS With Oracle Connectors](https://social.technet.microsoft.com/wiki/contents/articles/1957.ssis-with-oracle-connectors.aspx) Note: They are not free – Conrad Frix Sep 26 '19 at 23:26
  • @arce.est is right - these settings make a huge difference to the speed that data comes from Oracle and default values are usually incorrect / inefficient – Nick.Mc Sep 27 '19 at 04:48
  • @gasguirre Is the issue solved? From your other question it looks like you used the Oracle Attunity connectors – Yahfoufi Oct 01 '19 at 11:25
  • @Yahfoufi It works now. I haven't updated this question since it was about OLE DB and I ended up using Attunity. – Baldie47 Oct 01 '19 at 11:44

1 Answer


I think the main problem is not loading the data into SQL Server; check the OLE DB provider you are using to extract data from Oracle.

There are several things you can try:

  1. Use the Attunity connectors, which are the fastest ones available.
  2. Make sure you are not using the old Microsoft OLE DB Provider for Oracle (part of MDAC). Use the Oracle Provider for OLE DB (part of ODAC) instead.
  3. If that doesn't work, try using an ODBC connection / ODBC Source to read data from Oracle (a quick way to time the ODBC path is sketched below).
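As a quick way to compare the ODBC path from point 3 against the OLE DB source, you can time the same kind of raw fetch through an ODBC DSN. This is only a sketch: `OracleODBC` is a hypothetical DSN name, and the credentials and table name are placeholders.

```python
import time

import pyodbc  # pip install pyodbc

# "OracleODBC" is a hypothetical DSN configured with the Oracle ODBC driver.
conn = pyodbc.connect("DSN=OracleODBC;UID=scott;PWD=tiger")
cur = conn.cursor()

start = time.perf_counter()
cur.execute("SELECT * FROM my_big_table")  # placeholder table name
rows = 0
while True:
    batch = cur.fetchmany(10_000)  # pull rows in large batches
    if not batch:
        break
    rows += len(batch)
elapsed = time.perf_counter() - start
print(f"{rows:,} rows via ODBC in {elapsed:.1f}s ({rows / elapsed:,.0f} rows/s)")
```

If the ODBC route is clearly faster than what the OLE DB source achieves, switching the source component is likely to pay off.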
Hadi
  • From what I'm seeing, Attunity only works on the Enterprise edition of SQL Server, and I have Professional. I'll try with ODAC to see if I have luck – Baldie47 Sep 27 '19 at 12:34