
I'm facing a problem when trying to insert a large text object into an Oracle database using a stored procedure.

My program has to insert thousands of items into an Oracle database. Because inserting them one by one performs poorly, I wrote an Oracle procedure and insert all the items at once.

iBatis:

<parameterMap id="EttAttrCall" class="java.util.Map">
    <parameter property="ettAttrLst" jdbcType="CLOB" javaType="java.lang.String" mode="IN"/>
    <parameter property="pjtId" jdbcType="VARCHAR2" javaType="java.lang.String" mode="IN"/>
    <parameter property="createUser" jdbcType="VARCHAR2" javaType="java.lang.String" mode="IN"/>
</parameterMap>

<procedure id="insertTmpEttAttr" parameterMap="EttAttrCall">
    { call INS_TMP_ETT_ATTR(?,?,?) }
</procedure>
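
For reference, I call the procedure from Java roughly like this (a simplified sketch, assuming an already configured SqlMapClient; the class and method names are just for illustration, not my exact code):

    import com.ibatis.sqlmap.client.SqlMapClient;

    import java.sql.SQLException;
    import java.util.HashMap;
    import java.util.Map;

    public class TmpEttAttrDao {

        private final SqlMapClient sqlMap; // configured elsewhere from the sqlMapConfig

        public TmpEttAttrDao(SqlMapClient sqlMap) {
            this.sqlMap = sqlMap;
        }

        // All items are concatenated into one large string and passed as the CLOB parameter.
        public void insertAll(String ettAttrLst, String pjtId, String createUser) throws SQLException {
            Map<String, Object> params = new HashMap<String, Object>();
            params.put("ettAttrLst", ettAttrLst); // CLOB IN parameter
            params.put("pjtId", pjtId);
            params.put("createUser", createUser);
            sqlMap.update("insertTmpEttAttr", params); // executes { call INS_TMP_ETT_ATTR(?,?,?) }
        }
    }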

Oracle procedure:

create or replace
PROCEDURE INS_TMP_ETT_ATTR
(
  ettAttrLst IN CLOB,  
  pjtId IN VARCHAR2,
  createUser IN VARCHAR2
) 

Everything works fine as long as the data length does not exceed about 3 million characters, but beyond that Oracle throws this exception: ORA-06502: PL/SQL: numeric or value error: character string buffer too small

I searched the internet but didn't find much useful information, so any advice or idea would be highly appreciated.

Thinhbk

2 Answers


The limit for CLOBs is 4 GB, so I guess you're dealing with a VARCHAR2 limitation that is exceeded somewhere in your PL/SQL code.

You'll have to look at the line numbers in the error stack. Oracle usually provides additional ORA-06512 messages that indicate the PL/SQL objects and line numbers where the error occurred.

Just a wild guess in case you're using multibyte characters: take care whether you specify the VARCHAR2 size in bytes (the default) or characters. The limit for VARCHAR2 in PL/SQL is 32767 bytes (not characters!).

Edit after your comment below:

Review this question for examples of how to execute a batch insert.
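
For example, a plain JDBC batch insert with bind parameters looks roughly like this (only a sketch; the table and column names here are made up, not taken from your code):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    public class BatchInsertSketch {

        // One prepared statement, bound once per row, executed in batches of 1000.
        public static void insertAll(Connection conn, List<String[]> rows) throws SQLException {
            String sql = "INSERT INTO TMP_ETT_ATTR (ATTR_VALUE, PJT_ID, CREATE_USER) VALUES (?, ?, ?)";
            PreparedStatement ps = conn.prepareStatement(sql);
            try {
                int count = 0;
                for (String[] row : rows) {
                    ps.setString(1, row[0]);
                    ps.setString(2, row[1]);
                    ps.setString(3, row[2]);
                    ps.addBatch();
                    if (++count % 1000 == 0) {
                        ps.executeBatch(); // send the current batch to the database
                    }
                }
                ps.executeBatch(); // send any remaining rows
            } finally {
                ps.close();
            }
        }
    }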

HAL 9000
  • Thanks, I will check. By the way, I wonder if there is any configuration in Spring for the Oracle data source / transaction to deal with high-load data? – Thinhbk Mar 08 '12 at 04:36
  • None that I'm aware of. If your program does 1 million insert statements without bind parameters, there's no way to configure that into a bulk load. Rule of thumb: 1,000 rows without binds == 10,000 rows with binds == 1,000,000 rows bulk insert. – HAL 9000 Mar 09 '12 at 14:25
  • I really don't get what you mean; can you explain in more detail? In my program, I concatenate the values to insert into a CLOB parameter and call the procedure; in the procedure, I split the CLOB parameter into objects and then insert them, that's it. – Thinhbk Mar 13 '12 at 10:39
  • Maybe you were right: my application uses 2-byte characters (Korean), which may cause that error when the length of the input string is bigger than the length defined in the DB. – Thinhbk May 22 '12 at 07:35

I was getting this error when trying to write a string into the BLOB, so I converted it to a byte[] like this:

    byte[] bytes = content.getBytes(); // content is a String containing more than 100 lines of data
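
Then the byte array can be bound to the BLOB column, roughly like this (just a sketch; the table and column names are placeholders, and the explicit charset is an extra precaution so the conversion doesn't depend on the platform default):

    import java.nio.charset.StandardCharsets;

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class BlobWriteSketch {

        // Convert the text with an explicit charset and bind it as bytes to the BLOB column.
        public static void writeContent(Connection conn, long id, String content) throws SQLException {
            byte[] bytes = content.getBytes(StandardCharsets.UTF_8);
            PreparedStatement ps = conn.prepareStatement(
                    "UPDATE SOME_TABLE SET CONTENT_BLOB = ? WHERE ID = ?");
            try {
                ps.setBytes(1, bytes); // setBytes works for BLOB columns
                ps.setLong(2, id);
                ps.executeUpdate();
            } finally {
                ps.close();
            }
        }
    }
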
saba