
I'm trying to run a job that moves data from a set of tables in an Oracle connection to a SQL Server one, but I'm getting the following exception, which causes the job to stop:

2017/04/04 11:00:56 - read from [DEMANDE].0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException: 
2017/04/04 11:00:56 - read from [DEMANDE].0 - Couldn't get row from result set
2017/04/04 11:00:56 - read from [DEMANDE].0 - 
2017/04/04 11:00:56 - read from [DEMANDE].0 - Unable to get value 'Integer(38)' from database resultset, index 3
2017/04/04 11:00:56 - read from [DEMANDE].0 - Overflow Exception
2017/04/04 11:00:56 - read from [DEMANDE].0 - 
2017/04/04 11:00:56 - read from [DEMANDE].0 - 
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.core.database.Database.getRow(Database.java:2367)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.core.database.Database.getRow(Database.java:2337)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:145)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at java.lang.Thread.run(Thread.java:744)
2017/04/04 11:00:56 - read from [DEMANDE].0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException: 
2017/04/04 11:00:56 - read from [DEMANDE].0 - Unable to get value 'Integer(38)' from database resultset, index 3
2017/04/04 11:00:56 - read from [DEMANDE].0 - Overflow Exception
2017/04/04 11:00:56 - read from [DEMANDE].0 - 
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromResultSet(ValueMetaBase.java:4702)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.core.database.BaseDatabaseMeta.getValueFromResultSet(BaseDatabaseMeta.java:2091)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.core.database.DatabaseMeta.getValueFromResultSet(DatabaseMeta.java:2901)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.core.database.Database.getRow(Database.java:2359)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   ... 4 more
2017/04/04 11:00:56 - read from [DEMANDE].0 - Caused by: java.sql.SQLException: Overflow Exception
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at oracle.sql.NUMBER.toLong(NUMBER.java:371)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at oracle.jdbc.dbaccess.DBConversion.NumberBytesToLong(DBConversion.java:2915)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at oracle.jdbc.driver.OracleStatement.getLongValue(OracleStatement.java:4373)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at oracle.jdbc.driver.OracleResultSetImpl.getLong(OracleResultSetImpl.java:529)
2017/04/04 11:00:56 - read from [DEMANDE].0 -   at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromResultSet(ValueMetaBase.java:4660)

The index in the error message points to a column of type NUMBER(38,0) in the Oracle schema; the corresponding type in the SQL Server table created by Pentaho is DECIMAL(38,0).
The problem is that the very same job runs smoothly for other connections.
I'm using classes12.jar as the JDBC driver. I've tried ojdbc6.jar (the one certified to work with JDK 6, JDK 7 and JDK 8) but could not establish a connection to the database (Oracle 9i).

I've tried upgrading Pentaho Kettle to the latest stable release (Pentaho 7.0), but it still gives the same problem.

What could be causing this problem?

mounaim
  • Have you already looked at the value that causes the error? You can create a text output step and connect it to the table insert step with the "in error case" connection line. – s3b Apr 05 '17 at 08:09
  • I've added a text file output in case of error after inserting data into my table, but the text file output is empty after job execution; it doesn't contain the faulty data row. – mounaim Apr 05 '17 at 09:26
  • Have you tried a version of the Oracle JDBC driver older than ojdbc6.jar but newer than classes12.jar (i.e. ojdbc5.jar or ojdbc14.jar)? – Luke Woodward Apr 06 '17 at 21:37
  • Oh, this error happens in the Table input step! Spoon receives a value and wants to parse it to a long but fails, because the value is too big for a long. You should try the approach @LukeWoodward mentioned: try different drivers, maybe that helps. There seems to be no way to tell Spoon which data type it should parse to in the Table input step (see the sketch after these comments). – s3b Apr 07 '17 at 08:59
  • I tried ojdbc5.jar and could not connect; with ojdbc14.jar I could connect, but execution of the job resulted in the same issue as in the question. – mounaim Apr 07 '17 at 09:44
  • Sometimes this error occurs because of a change in culture/collation. Your script must be running in some specific order; investigate using the same order, see which row was inserted last, then look at the value of the decimal column in the next row. There must be some way to debug this. – KumarHarsh Apr 11 '17 at 11:40
  • Looks like Kettle maps NUMBER(38,0), with this exact precision and scale, to Long if you specify the database as Oracle (https://github.com/pentaho/pentaho-kettle/blob/6.0.1.1-R/core/src/org/pentaho/di/core/row/value/ValueMetaBase.java#L4473). You can try the following options: 1) set up your connection as a JNDI connection, specify the DB type as PostgreSQL, but specify Oracle's JDBC driver class; 2) try a `Dynamic SQL row` step instead of Table input, pass the same query as before, but in the Template SQL cast your NUMBER(38,0) to e.g. NUMBER(37,0), so the type will be mapped to BigNumber. – Andrei Luksha Apr 13 '17 at 17:52
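
To make the failure concrete, here is a minimal JDBC sketch of what the Table input step effectively does when it reads that column; the connection details and the `big_col` name for the NUMBER(38,0) column at index 3 are placeholders:

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OverflowDemo {
    public static void main(String[] args) throws Exception {
        // Old drivers such as classes12.jar must be loaded explicitly.
        Class.forName("oracle.jdbc.driver.OracleDriver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//host:1521/ORCL", "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(
                 "SELECT id, label, big_col FROM DEMANDE")) {
            while (rs.next()) {
                // This is what Kettle does once it has mapped the field to its
                // Integer type: the driver calls oracle.sql.NUMBER.toLong(),
                // which throws "Overflow Exception" for values > Long.MAX_VALUE.
                // long broken = rs.getLong(3);

                // Reading the same column as BigDecimal always succeeds,
                // which is why the BigNumber mapping avoids the error.
                BigDecimal ok = rs.getBigDecimal(3);
                System.out.println(ok);
            }
        }
    }
}
```

The commented-out `getLong` call reproduces the stack trace in the question; the `getBigDecimal` call is the path the BigNumber mapping from the comment above would take.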

3 Answers


Have you looked at the link below from MS?

https://learn.microsoft.com/en-us/sql/relational-databases/replication/non-sql/data-type-mapping-for-oracle-publishers

  • I'm not doing SQL Server replication :) This was about some values in the Oracle table that couldn't be cast to long since they were greater than Long.MAX_VALUE, and that was causing the overflow exception mentioned in the question. I "solved" the issue by importing only the data within the long range (see the sketch after these comments), and the job ran smoothly, but I don't know why there isn't any JDBC mapping for values greater than Long.MAX_VALUE :/ – mounaim Apr 07 '17 at 16:52
  • @mounaim: the tricky bit is that `NUMBER(38, 0)` is the default in the database, but always mapping that to `BigDecimal` would not be an appropriate default since it would lead to slow and clumsy code. Generic code can't scan ahead, see that there are out-of-range values and then decide not to use `long` after all. `long` is a usually reasonable (but in this case incorrect) choice for the mapping, but `oracle.sql.NUMBER` certainly has a `toBigDecimal` method (or better, `toBigInteger`). But the database layer needs to support configuring such a mapping -- JDBC itself is not the issue. – Jeroen Mostert Apr 13 '17 at 15:47
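
For reference, mounaim's workaround boils down to restricting the source query to values a signed 64-bit long can hold; a sketch, with table and column names as placeholders:

```java
// Hypothetical source query for the Table input step: only rows whose
// big_col fits in a Java long are imported, so oracle.sql.NUMBER.toLong()
// can no longer overflow. Out-of-range rows are silently skipped.
String query =
    "SELECT id, label, big_col FROM DEMANDE"
    + " WHERE big_col BETWEEN " + Long.MIN_VALUE   // -9223372036854775808
    + " AND " + Long.MAX_VALUE;                    //  9223372036854775807
```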

I think your problem is that Oracle's NUMBER(38,0) is not equivalent to SQL Server's DECIMAL(38,0).

From Oracle documentation:

The following numbers can be stored in a NUMBER column:

• Positive numbers in the range 1 x 10^-130 to 9.99...9 x 10^125 with up to 38 significant digits

• Negative numbers from -1 x 10^-130 to -9.99...99 x 10^125 with up to 38 significant digits

• Zero

• Positive and negative infinity (generated only by importing from an Oracle Database, Version 5)

SQL Server documentation states that Decimal(38,0):

Numeric data types that have fixed precision and scale. Decimal and numeric are synonyms and can be used interchangeably.

Arguments

decimal[ (p[ ,s] )] and numeric[ (p[ ,s] )] Fixed precision and scale numbers. When maximum precision is used, valid values are from -10^38 + 1 through 10^38 - 1

So an Oracle NUMBER can store values up to about 9.99 x 10^125, whereas a SQL Server DECIMAL can store at most 10^38 - 1. Hence the overflow.

You may want to use SQL Server's FLOAT data type, which can store values up to 1.79E+308.
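
To put the ranges side by side, a quick sketch (the Oracle ceiling is approximated):

```java
import java.math.BigDecimal;

public class RangeDemo {
    public static void main(String[] args) {
        // Largest value Kettle's Integer type (a Java long) can hold: 19 digits.
        BigDecimal longMax = BigDecimal.valueOf(Long.MAX_VALUE);
        System.out.println(longMax);                    // 9223372036854775807

        // Largest value SQL Server DECIMAL(38,0) can hold: 10^38 - 1.
        BigDecimal decimalMax = BigDecimal.TEN.pow(38).subtract(BigDecimal.ONE);
        System.out.println(decimalMax);                 // 38 nines

        // Oracle NUMBER tops out near 9.99...9 x 10^125, far beyond both.
        BigDecimal numberMax = new BigDecimal("9.99E+125");
        System.out.println(numberMax.compareTo(decimalMax) > 0);   // true
    }
}
```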

Steve Ford
  • This is not the actual issue, because the code is failing on interaction with an Oracle database. SQL Server doesn't get into it at all. Even if you did change the SQL database to use `FLOAT`, this would not suddenly cause the value to fit in a Java `long`. Furthermore, using a floating-point type to store integer values is almost never appropriate -- few applications would be happy to trade exactness for range. – Jeroen Mostert Apr 13 '17 at 15:37
  • @JeroenMostert The Oracle documentation states that an Oracle NUMBER can have up to 38 significant digits even though it can hold a number up to 9.99 x 10^125. Therefore a number with more than 38 significant digits cannot be stored exactly, and for those numbers you must trade exactness for range! – Steve Ford Apr 18 '17 at 10:26
  • Using `FLOAT` will lose exactness at far less than 38 digits (although, admittedly, exact arithmetic is possible on most of the range of a `long`). – Jeroen Mostert Apr 18 '17 at 10:39
  • @JeroenMostert I agree with you about FLOAT: because floats are stored in binary rather than decimal, they often exhibit confusing behavior. For example, 0.1 is not representable in binary with finite precision, as the representation has the sequence 1100 repeated endlessly. When this is rounded to 24 bits it becomes 0.100000001490116119384765625 in decimal, so you are correct to be wary. See https://en.wikipedia.org/wiki/Floating-point_arithmetic and the snippet just below. – Steve Ford Apr 18 '17 at 11:11
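
That rounding is easy to reproduce; converting the float 0.1f to a BigDecimal exposes the exact value the 24-bit significand actually holds:

```java
import java.math.BigDecimal;

public class FloatDemo {
    public static void main(String[] args) {
        // 0.1 has no finite binary representation; the nearest float is:
        System.out.println(new BigDecimal(0.1f));
        // prints 0.100000001490116119384765625
    }
}
```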

Unable to get value 'Integer(38)' from database resultset, index 3

Overflow Exception

First try: have you tried reading the value as a BigNumber? In Kettle I use it for large numbers, but I've only tried it with PostgreSQL.

Second try: have you tried casting the value (via SQL) to a smaller integer type? Or casting it to a string and converting it back to a number afterwards? (See the sketch below.)

I know the second solution is not elegant, but sometimes it solves the problem.
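
For illustration, here is how both variants might look in the Table input query; table and column names are placeholders, and the NUMBER(37,0) target follows Andrei Luksha's comment under the question:

```java
// Variant 1: cast down to NUMBER(37,0), which Kettle maps to BigNumber
// instead of Integer (risks ORA-01438 if a value really needs 38 digits).
String castQuery =
    "SELECT id, label, CAST(big_col AS NUMBER(37,0)) AS big_col FROM DEMANDE";

// Variant 2: read the column as text and convert it back downstream,
// e.g. with a Select values step whose metadata is set to BigNumber.
String stringQuery =
    "SELECT id, label, TO_CHAR(big_col) AS big_col FROM DEMANDE";
```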

Daniele Licitra