I have a PL/SQL function that replicates MySQL's GROUP_CONCAT(). The function takes a CURSOR and returns a VARCHAR2. However, some objects in my application have enough data that the concatenated string ends up longer than 4000 characters. Since I use this function in SELECT statements (and not just in PL/SQL), this makes Oracle angry and it throws an ORA-06502.
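Roughly, the function looks something like this (a simplified sketch, not my exact code; the name and delimiter are just placeholders):

    -- Assumes the cursor selects a single VARCHAR2 column.
    CREATE OR REPLACE FUNCTION group_concat(p_cursor IN SYS_REFCURSOR)
      RETURN VARCHAR2
    IS
      l_value  VARCHAR2(4000);
      l_result VARCHAR2(32767);  -- fine inside PL/SQL
    BEGIN
      LOOP
        FETCH p_cursor INTO l_value;
        EXIT WHEN p_cursor%NOTFOUND;
        IF l_result IS NOT NULL THEN
          l_result := l_result || ',';
        END IF;
        l_result := l_result || l_value;
      END LOOP;
      CLOSE p_cursor;
      -- Works fine when called from PL/SQL, but once the result is longer
      -- than 4000, calling it from a SELECT raises ORA-06502.
      RETURN l_result;
    END;
    /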
So, I changed the function to return a CLOB instead. This takes care of the errors, but when I read the data through JDBC, performance takes a massive hit. I'm reading a lot of data out, and switching from VARCHAR2 to CLOB has made execution 10-20x slower. I've been looking into ways to optimize this, but since the CLOB is returned from a function rather than stored in a table, most of what I've read doesn't apply.
Is there any way to improve this? I'd like to stress that this is in no way related to the database's own performance: concatenating values up to 4000 characters is very fast, only a few objects need more than that, and the largest value is around 5000 characters. LOBs are generally optimized for large amounts of raw data, and if it weren't for Oracle's 4000-character limit on VARCHAR2 values in SQL statements, I wouldn't need to do this at all.
EDIT - I want to reiterate that the CLOB is created inside the function; it is not reading any CLOBs stored in the database. The function simply concatenates VARCHAR2s and returns the result as a CLOB.
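The CLOB version is essentially the same thing, along these lines (again a simplified sketch, not my exact code):

    CREATE OR REPLACE FUNCTION group_concat(p_cursor IN SYS_REFCURSOR)
      RETURN CLOB
    IS
      l_value  VARCHAR2(4000);
      l_result CLOB;
    BEGIN
      LOOP
        FETCH p_cursor INTO l_value;
        EXIT WHEN p_cursor%NOTFOUND;
        IF l_result IS NOT NULL THEN
          l_result := l_result || ',';
        END IF;
        -- Just appending VARCHAR2 pieces; the CLOB only exists as the return value.
        l_result := l_result || l_value;
      END LOOP;
      CLOSE p_cursor;
      RETURN l_result;
    END;
    /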