
I have a simple Data Flow Task that is transferring 1M records from a source to a destination. Both the source table and the destination table are Oracle tables. The data at the source is composed of two columns, Id and Name, as shown below -

-- Source table (INT and VARCHAR are ANSI synonyms that Oracle
-- maps to NUMBER and VARCHAR2)
create table SrcEmployeeData
(
   Id   int
  ,Name varchar(50)
);

The values in it are (1,'Test'), (2,'Test') and so on up to a million. The destination table is a mirror of the source table, except that the Name column is NOT NULL. At the source, I have updated the Name column of 10 random rows to NULL. To capture these errors, I have created 4 destinations with proper error redirection; a sketch of the table setup follows.
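
For reference, here is a minimal sketch of that setup. Only the source DDL above is the actual one; the destination DDL is reconstructed from the description, and the insert/update statements are just one assumed way of producing the test data -

-- Destination: mirror of the source, but Name is NOT NULL
-- (reconstructed from the description; not the original DDL)
create table EmployeeData
(
   Id   int
  ,Name varchar(50) not null
);

-- One way to load the 1M source rows (1,'Test'), (2,'Test'), ...
insert into SrcEmployeeData (Id, Name)
select level, 'Test'
from dual
connect by level <= 1000000;

-- Null out the Name column of 10 random rows
update SrcEmployeeData
set Name = null
where Id in (select Id
             from (select Id
                   from SrcEmployeeData
                   order by dbms_random.value)
             where rownum <= 10);

commit;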

The first three point to the target table. The first one has Rows Per Batch / Maximum insert commit size set to 50,000; the second has the same values set to 10,000; the third has them set to 1. The final destination is used to capture the actual error records. All three target-table destinations have the data access mode set to 'Table or view - fast load'.

It looks like this -

(screenshot: the Data Flow Task, with the source feeding the 50,000-row destination and errors cascading through the 10,000-row and 1-row destinations into the error-capture table)

As the package starts executing, the propagation of error records down to the last destination goes as expected. Except that, when it comes to the last record, the first destination with a 50,000-row batch fails. When I look at the destination EmployeeData table, surprisingly I don't find even a single record. What is going wrong here?

These are the two errors I get -

[OLE_DST - Load Data (50,000 rows) [69]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0xFFFFFFFF.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE_DST - Load Data (50,000 rows)" (69) failed with error code 0xC0202009 while processing input "OLE DB Destination Input" (82). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

Here are the random rows having the NULL values -

(screenshot: the 10 source rows whose Name column is NULL)

I am using the latest Oracle Provider for OLE DB and downloaded the latest Oracle Data Access Components (ODAC) from here - link.


EDIT - Just to make this a little more weird: the last destination, where I am capturing the error rows, is receiving all incorrect data, as shown below (I am using a SQL Server table as the error destination). I have run this test three times so far, and each time I get the same issue. Any help?

(screenshot: the error-capture table containing garbled/incorrect row values)

  • Remember, each of these batch commits is a transaction, so one failure in a batch and the entire transaction is rolled back (see the sketch after these comments). The 9/10 is odd, though. Does your destination table have a unique index? That would allow exactly one of the 10 NULLs, leaving you with nine errors. Have a look at this [page](https://sqljgood.wordpress.com/2014/09/29/ssis-troubleshooting-error-1071607685/), which may help. It discusses your exact `1071607685` error code. – Troy Witthoeft Aug 17 '17 at 02:57
  • Thanks @TroyWitthoeft for the link. I have updated the destinations' data access mode to 'OpenRowset' and the data is now being captured correctly. – VKarthik Aug 17 '17 at 04:06
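
To illustrate the rollback Troy describes, here is a hypothetical pure-SQL analogue of one fast-load batch with a 50,000-row commit size; a single multi-row insert is one atomic statement in Oracle, so one NULL rejects the entire batch -

-- Analogue of one fast-load batch: a single statement inserting
-- 50,000 rows. If any of them has a NULL Name, Oracle raises
-- ORA-01400 and the statement inserts nothing at all.
insert into EmployeeData (Id, Name)
select Id, Name
from SrcEmployeeData
where Id <= 50000;
-- ORA-01400: cannot insert NULL into ("<schema>"."EMPLOYEEDATA"."NAME")
-- => 0 rows inserted, which matches the empty destination table.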

1 Answer


Following the post linked by Troy Witthoeft, all I had to do was change the data access mode for the three Oracle OLE DB destinations from 'Table or view - fast load' to 'Table or view', i.e. OpenRowset. The data then got transferred correctly, and the errors too were captured exactly as needed.
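
In 'Table or view' mode the destination inserts rows one at a time, so only the offending rows fail and get redirected while the rest are committed. A quick sanity check after the run (the error-table name here is hypothetical) -

-- In Oracle: all the good rows should have arrived
select count(*) from EmployeeData;    -- expect 999990

-- In SQL Server: the redirected rows (dbo.ErrorRows is a
-- hypothetical name for the error-capture table)
select count(*) from dbo.ErrorRows;   -- expect 10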
