
I am getting:

ERROR [HY008] Operation canceled
ERROR [08S01] Communication link failure

when trying to upload an external .txt file to a Netezza database. I have done this in the past (just last week), but today I receive this error. I am able to connect to the DB, have run the truncate and create table statements for loading this data, can select, etc., but still have no luck loading. It's around 200K records, and here is my code:

INSERT INTO PTG_ITO_ETL.FINANCE_TY15_RT_TPG
SELECT * FROM EXTERNAL 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\TPG_INTUIT_RT_PRODIV_20150214.TXT'
USING
(
    MAXERRORS 1
    DATESTYLE 'MDY'
    DATEDELIM '/'
    BOOLSTYLE 'Y_N'
    Y2BASE 2000
    ENCODING 'internal'
    SKIPROWS 1
    REMOTESOURCE 'ODBC'
    ESCAPECHAR '\'
)

I have tried the solution from the only other post I could find on the subject:

ERROR [08S01] Communication link failure while inserting data in to external table in netezza

"I have found in Windows 7 and Windows Server 2008 R2 TCP Chimney Settings were the culprit.

http://blogs.dirteam.com/blogs/sanderberkouwer/archive/2008/05/15/backward-compatible-networking-with-server-core.aspx

The following commands fixed this issue for me:

netsh interface tcp set global rss=disabled
netsh interface tcp set global chimney=disabled
netsh interface tcp set global autotuning=disabled"

But to no avail. I don't know what is causing this issue. I am on Windows 7 using Aginity, Netezza version 7.0.4.

Thanks, Craig

thagraybush
  • How long does it take before you get that error? Also, what specific version of the ODBC driver are you using? Without knowing what's going on in the ODBC trace, it's hard to say. However, a bug with similar symptoms to what you report was addressed in the 7.0.4.7 client package. – ScottMcG Feb 17 '15 at 22:34
  • If you add `LogDir 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\'`, do you get an .nzlog / .nzbad file? If you up your `maxerrors` to 100, do you get those files? If so, what is the error message? (See the sketch after these comments.) – Niederee Feb 18 '15 at 01:31
  • @ScottMcG, anywhere from 15s (the fastest I've seen) to 86s (the longest it's taken). The specific driver is version 7.01.00.36189. Maybe I will try to get an updated one from IBM; I will let you know how it goes. Thanks! – thagraybush Feb 18 '15 at 18:33
  • @Niederee, I tried that before posting this. It does not print a log file; it just gives the errors I specified above. I've also taken max errors up, and it doesn't make a difference: the same error is returned in the time frames specified above. I'm guessing the driver issue suggested by ScottMcG is the problem. Thanks! – thagraybush Feb 18 '15 at 18:36
  • @ScottMcG, I downloaded the latest driver. It definitely 'tries' harder; it still errored out with the same error a couple of times, but it also worked a few times. What's the point of external tables if they can't be very large, lol. Thanks for your help. – thagraybush Feb 18 '15 at 21:45
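
For reference, here is a minimal sketch of the diagnostic variant Niederee suggests in the comments above: the original statement with a LOGDIR option added and MAXERRORS raised, so Netezza can write .nzlog/.nzbad files for inspection. The log directory shown simply reuses the source file's folder, as in the comment; point it at any writable local directory.

    -- Sketch only: same load as above, with a log directory and a higher
    -- error threshold so rejected rows are written out for inspection.
    INSERT INTO PTG_ITO_ETL.FINANCE_TY15_RT_TPG
    SELECT * FROM EXTERNAL 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\TPG_INTUIT_RT_PRODIV_20150214.TXT'
    USING
    (
        MAXERRORS 100
        LOGDIR 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\'
        DATESTYLE 'MDY'
        DATEDELIM '/'
        BOOLSTYLE 'Y_N'
        Y2BASE 2000
        ENCODING 'internal'
        SKIPROWS 1
        REMOTESOURCE 'ODBC'
        ESCAPECHAR '\'
    )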

2 Answers


Please try the following and see if it errors:

    -- Diagnostic load: each row lands in a single wide varchar column, and
    -- the delimiter is set to 199, a value presumably chosen because it
    -- should not appear in the data. If this still fails, the problem is in
    -- the transfer itself rather than in parsing the rows.
    SELECT * FROM EXTERNAL 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\TPG_INTUIT_RT_PRODIV_20150214.TXT'
    (field1 varchar(20000))
    USING
    (
        MAXERRORS 1
        Delim 199
        DATESTYLE 'MDY'
        DATEDELIM '/'
        BOOLSTYLE 'Y_N'
        Y2BASE 2000
        ENCODING 'internal'
        SKIPROWS 1
        REMOTESOURCE 'ODBC'
        ESCAPECHAR '\'
    )
Niederee
  • I am having the same error (08S01), but I am using ODBC driver version 7.2.0.3. My txt file contains approx. 25M rows of data (392 MB). What is the resolution to this issue? Thanks. – TheGoat Oct 21 '18 at 22:27

I appreciate the help from @ScottMcG. After downloading the latest Netezza driver (7.0.4.7), I was able to complete this job. It still errored out (with the same error) a couple of times, but it also worked many times during extensive testing.

I read another similar post, and this error seems to become more common the larger the load gets. In one case, a person using an external table with a million+ records errored out every time. If need be, you may have to split the data into smaller external tables for loading into one table/DB. It seems to start having problems around 200K+ records.

This is also from my laptop, so the limit may be much higher if you were on a box inside the data center. I would suggest the driver upgrade first, and then splitting into smaller files if necessary (a rough sketch of the split-file approach follows).
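
As an illustration of the splitting approach (a sketch only; the part-file names are hypothetical and assume the source .txt has been split outside the database, with the header row left only in the first chunk):

    -- Load the first chunk (still carries the header row, so keep SKIPROWS 1).
    INSERT INTO PTG_ITO_ETL.FINANCE_TY15_RT_TPG
    SELECT * FROM EXTERNAL 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\TPG_INTUIT_RT_PRODIV_20150214_part1.TXT'
    USING
    (
        MAXERRORS 1
        DATESTYLE 'MDY'
        DATEDELIM '/'
        BOOLSTYLE 'Y_N'
        Y2BASE 2000
        ENCODING 'internal'
        SKIPROWS 1
        REMOTESOURCE 'ODBC'
        ESCAPECHAR '\'
    );

    -- Repeat for each remaining chunk (..._part2.TXT, ..._part3.TXT, and so on),
    -- dropping SKIPROWS 1 if those chunks carry no header row.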

Thanks All!

thagraybush
  • As one last long-shot idea, you could try switching Aginity over to the OLE-DB provider at connection time. It is packaged in the same Windows client package as the ODBC driver. – ScottMcG Feb 19 '15 at 20:20