2

I have a PL/SQL function that replicates GROUP_CONCAT() from MySQL. The function takes a CURSOR and returns a VARCHAR2. However, some objects in my application have enough data that the concatenated string grows larger than 4000 characters. Since I use this function in SELECT statements (and not just in PL/SQL), this makes Oracle angry and throw an ORA-06502.

So, I changed the function to return a CLOB instead. This takes care of the errors, but when using JDBC to read the data, performance takes a massive hit. I'm reading a lot of data out and switching to CLOBs from VARCHAR2s has resulted in 10-20x slower execution time. I've been looking into optimizing this somehow, but since the CLOB is returned from a function and is not in a table, most of what I've read is not applicable.

Is there any way to improve this? I'd like to stress that this is in no way related to the actual database's performance; concatenating values up to 4000 characters is very fast, there are only a few objects that require more than that, and the biggest value is around 5000 characters. LOBs are generally optimized for large raw data, and if it weren't for Oracle's size restriction on columns existing in SELECT statements, I wouldn't need to do this.

EDIT - I want to reiterate that the CLOB is created in a function, it is not reading any CLOBs in the database. It is simply concatenating VARCHAR2s and returning the result as a CLOB.

monitorjbl
  • 4,280
  • 3
  • 36
  • 45
  • How are you getting back the data? Might want to take a look at this too: http://docs.oracle.com/cd/B19306_01/appdev.102/b14249/adlob_performance.htm – Mike Aug 22 '12 at 19:49
  • I'm getting all the data for each object in a single row. That's the reason I'm concatenating values instead of just reading multiple rows. Everything was performing great until I found that some objects would have more than 4000 characters in one of the concatenated columns. That Oracle document is not very helpful in this situation as I am creating the CLOB from a function solely because of the 4000 character VARCHAR2 limit; there are no CLOBs being stored in the database. If it were possible, I would not use a CLOB at all. – monitorjbl Aug 22 '12 at 20:07
  • I meant, are you using getString() or are you opening a stream to get the data? – Mike Aug 22 '12 at 20:44
  • Two articles with differing approaches. http://maclochlainn.wordpress.com/how-to-read-a-clob-through-jdbc/ and http://rocksolutions.wordpress.com/2010/06/07/handling-clobs-made-easy-with-oracle-jdbc-10g/ – Mike Aug 22 '12 at 20:50
  • Similar SO Question http://stackoverflow.com/questions/1525780/oracle-clob-performance – Mike Aug 22 '12 at 20:52
  • I am using getString(), but my understanding of the CLOB column type is that all LOBs are streamed by the driver. I was hoping there might be a way around that, as the performance of sending 5000 characters with the entire row can't be much worse than sending 4000. – monitorjbl Aug 23 '12 at 14:58
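
The comments above come down to how the driver hands back the LOB: `getString()` versus reading the character stream in chunks. A minimal sketch of the chunked approach, using a `StringReader` as a stand-in for `resultSet.getCharacterStream(col)` (or `clob.getCharacterStream()`) so it runs without a database:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.io.UncheckedIOException;

public class ClobReadDemo {
    // Drains the whole character stream in fixed-size chunks. With Oracle
    // JDBC you would pass resultSet.getCharacterStream(col) or
    // clob.getCharacterStream() here instead of a StringReader.
    static String readAll(Reader reader, int chunkSize) {
        StringBuilder sb = new StringBuilder();
        char[] buf = new char[chunkSize];
        try {
            int n;
            while ((n = reader.read(buf)) != -1) {
                sb.append(buf, 0, n);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // StringReader stands in for the CLOB stream in this sketch.
        String result = readAll(new StringReader("abcdef".repeat(1000)), 4096);
        System.out.println(result.length()); // 6000
    }
}
```

Whether this beats `getString()` depends on the driver's LOB prefetch settings; values around 5000 characters may still fit in a single round trip.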

2 Answers

0

I don't know how to solve the underlying issue, but one way to minimise it is to keep two versions of your function: one that returns a VARCHAR2 and one that returns a CLOB.

The VARCHAR2 version would call the CLOB version internally and return an exception code (or raise an error) if the value is longer than 4000/32000 characters.

Your Java code could then detect that and re-call the CLOB version directly for the small number of cases that require it.
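
A sketch of the client side of this pattern. The two SQL strings are placeholders for your queries against the VARCHAR2 and CLOB versions of the function, and the assumption is that the VARCHAR2 version either overflows with ORA-06502 or raises a user-defined error in the -20000 range:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class GroupConcatFallback {
    // ORA-06502 is the VARCHAR2 buffer overflow; 20000-20999 covers a
    // user-defined RAISE_APPLICATION_ERROR from the wrapper function.
    static boolean needsClobFallback(SQLException e) {
        return e.getErrorCode() == 6502
            || (e.getErrorCode() >= 20000 && e.getErrorCode() <= 20999);
    }

    static String fetch(Connection conn, String fastSql, String clobSql,
                        long id) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(fastSql)) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getString(1); // fast path: VARCHAR2 version
            }
        } catch (SQLException e) {
            if (!needsClobFallback(e)) {
                throw e; // unrelated error: propagate
            }
        }
        try (PreparedStatement ps = conn.prepareStatement(clobSql)) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getString(1); // slow path: CLOB version
            }
        }
    }
}
```

The second round trip only happens for the handful of oversized objects, so the common case keeps VARCHAR2 performance.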

ShoeLace
  • 3,476
  • 2
  • 30
  • 44
0

You can convert your CLOB into a VARCHAR2 before returning it. The max size of a VARCHAR2 in PL/SQL is 32K. If 32K isn't large enough, store your results in a temporary table and read from it using JDBC. That will be faster than going through the CLOB network protocol.
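
The 32K limit applies inside PL/SQL, not in a SQL SELECT list, so the function would be invoked through a PL/SQL call rather than a query. A sketch, where `group_concat_vc32k` is a hypothetical wrapper that builds the CLOB and returns `DBMS_LOB.SUBSTR(result, 32767, 1)` as a VARCHAR2:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Types;

public class ClobAsVarchar {
    // group_concat_vc32k is an assumed PL/SQL wrapper, not a built-in.
    static final String CALL_SQL = "{? = call group_concat_vc32k(?)}";

    static String fetch(Connection conn, long id) throws SQLException {
        try (CallableStatement cs = conn.prepareCall(CALL_SQL)) {
            // Registering the result as VARCHAR keeps it off the CLOB
            // protocol; the PL/SQL 32767-character limit applies here
            // instead of the 4000-byte SELECT-list limit.
            cs.registerOutParameter(1, Types.VARCHAR);
            cs.setLong(2, id);
            cs.execute();
            return cs.getString(1);
        }
    }
}
```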

Jean de Lavarene
  • 3,461
  • 1
  • 20
  • 28