
I have the code below in my PL/SQL procedure, where I call API_XXX.put (which calls utl_file.put) in a while loop. l_xmldoc is a CLOB returned by the function getReportXML, which produces the report's XML.

The code I wrote to write the XML to a file is:

l_offset := 1;
WHILE (l_offset <= l_length)
LOOP
    l_char := dbms_lob.substr(l_xmldoc, 1, l_offset);

    IF (l_char = to_char(10))  -- I also tried IF (l_char = chr(10)) but it did not work
    THEN
        API_XXX.new_line(API_XXX.output, 1);
    ELSE
        API_XXX.put(API_XXX.output, l_char);
    END IF;

    l_offset := l_offset + 1;
END LOOP;

Please note that API_XXX is an existing package which I am not able to modify, and this API calls fflush at the end of put.

The relevant part of API_XXX.put is like below ("WHICH" is the first parameter):

         elsif WHICH = API_XXX.OUTPUT then
           temp_file := OUT_FNAME;
           utl_file.put(F_OUT, BUFF);
           utl_file.fflush(F_OUT);

API_XXX.new_line is like this (LINES is the number of lines to write):

         elsif WHICH = API_XXX.OUTPUT then
           temp_file := OUT_FNAME;
           utl_file.new_line(F_OUT, LINES);
           utl_file.fflush(F_OUT);

I notice that on my customer's side, the put/new_line calls in the while loop sometimes raise UTL_FILE.WRITE_ERROR for an unknown reason (maybe because l_length is too large, up to 167465).

I read "Oracle PL/SQL UTL_FILE.PUT buffering" and found that this is the same cause: my l_xmldoc is really large, and when I loop over it there is no newline terminator, so the buffer grows up to 32767 even though I fflush every time.

So, how should I convert l_xmldoc into varchars with newline terminators?

PS: I confirmed that my customer is using Oracle 11g.

  • I suggest checking the file content and encoding, or better, use UTL_FILE.PUT_RAW. In the past I had problems where I tried to import Excel files and the import crashed for no reason; then I figured out that the problem was unknown symbols. Bad encoding can also corrupt a file. – DARK_A Aug 13 '14 at 07:18
  • Hi @DARK_A, was your problem UTL_FILE.WRITE_ERROR? – JaskeyLam Aug 13 '14 at 08:21
  • I'm also [using](https://bitbucket.org/janihur/orasql-ex/src/default/packages/jh_file.pkb) `utl_file.put_raw` to write my files. In general I first convert the CLOB to a BLOB and only then write it to a file. – user272735 Aug 13 '14 at 09:39
  • @user272735, did you ever encounter my problem? Do you know why? Are you sure that your solution works? Please also share your code in an answer. – JaskeyLam Aug 13 '14 at 09:42
  • I don't recall anymore what issues I had back then. But the code has been in production for almost two years without issues, so I'm positive it works. Luckily the files are small enough to be kept in memory. – user272735 Aug 13 '14 at 10:48
  • I haven't tested this, but I suspect that when calling UTL_FILE.PUT or PUT_LINE the input length can't exceed the maximum allowed for a VARCHAR2 in PL/SQL (32767). – Bob Jarvis - Слава Україні Aug 13 '14 at 14:18
  • @BobJarvis But I fflush every time. – JaskeyLam Aug 13 '14 at 16:57
  • I don't remember what my error was. @BobJarvis raised a very good point. Try sizing your input by a static number of characters, not by the newline symbol. – DARK_A Aug 14 '14 at 11:01

2 Answers

  1. Post the Oracle Version you are using! Or we can just guess around...

  2. Your fflush will not work as you expect - From the documentation:

    FFLUSH physically writes pending data to the file identified by the file handle. Normally, data being written to a file is buffered. The FFLUSH procedure forces the buffered data to be written to the file. The data must be terminated with a newline character.

  3. tbone is absolutely right: the line TO_CHAR(10) is wrong! Just try SELECT TO_CHAR(10) FROM DUAL; you will get '10', which you then compare to a single character. A single character will never be '10', since '10' has two characters!

  4. Your problem is most likely a buffer overflow caused by too-large XML files, but keep in mind that other problems on the target system can also lead to WRITE_ERRORs, which should be handled.

Solutions

  • Quick & dirty: since you don't seem to care about performance anyway, you can just close the file every X bytes and reopen it with 'a' for append. So just add to the loop:

    IF MOD( l_offset, 32000 ) = 0
    THEN
      UTL_FILE.FCLOSE( f_out );
      -- FOPEN is a function, so reassign the handle when reopening in append mode
      f_out := UTL_FILE.FOPEN( out_fpath, out_fname, 'a', 32767 );
    END IF;
    
  • Use the right tool for the right job: UTL_FILE is not suited for handling complex data. The only use case for UTL_FILE is small, newline-separated lines of text. For everything else you should write RAW bytes (see the sketch after this list), which also gives you proper control over the ENCODING, which is currently just a lucky guess.

  • Write a Java stored procedure with NIO FileChannels - fast, safe, nice... But be careful, your program might run 10 times as fast!
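
For reference, here is a minimal sketch of the CLOB-to-BLOB / PUT_RAW route. The procedure name write_clob_raw, the directory XML_OUT_DIR, and the file name report.xml are made up for illustration; the asker's API opens the file itself, so this only shows the general technique:

    -- Hypothetical sketch: names of the procedure, directory and file are illustrative only.
    CREATE OR REPLACE PROCEDURE write_clob_raw(p_doc IN CLOB) IS
        l_blob      BLOB;
        l_file      UTL_FILE.FILE_TYPE;
        l_dest_off  INTEGER := 1;
        l_src_off   INTEGER := 1;
        l_lang_ctx  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
        l_warning   INTEGER;
        l_pos       PLS_INTEGER := 1;
        l_len       PLS_INTEGER;
        c_chunk     CONSTANT PLS_INTEGER := 32767;  -- max bytes per PUT_RAW call
    BEGIN
        -- convert the CLOB into a BLOB
        DBMS_LOB.CREATETEMPORARY(l_blob, TRUE);
        DBMS_LOB.CONVERTTOBLOB(l_blob, p_doc, DBMS_LOB.LOBMAXSIZE,
                               l_dest_off, l_src_off,
                               DBMS_LOB.DEFAULT_CSID, l_lang_ctx, l_warning);

        -- write the BLOB in raw chunks; no newline terminator is needed at all
        l_file := UTL_FILE.FOPEN('XML_OUT_DIR', 'report.xml', 'wb', 32767);
        l_len  := DBMS_LOB.GETLENGTH(l_blob);
        WHILE l_pos <= l_len LOOP
            UTL_FILE.PUT_RAW(l_file, DBMS_LOB.SUBSTR(l_blob, c_chunk, l_pos), TRUE);
            l_pos := l_pos + c_chunk;
        END LOOP;
        UTL_FILE.FCLOSE(l_file);
        DBMS_LOB.FREETEMPORARY(l_blob);
    END write_clob_raw;
    /

Writing binary chunks sidesteps the newline requirement entirely, and the blob_csid parameter of CONVERTTOBLOB gives explicit control over the character set of the output.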

Falco
  • So FFLUSH works fine if I only put a short varchar without a newline terminator? And your solution does not work for me, although I agree with it: 1. I have no idea which file is opened, since my API creates a random name for the file and opens it for me, so I cannot reopen it. 2. I can't use put_raw, since the API does not give me that (again, I have to use the API to open the file). 3. A Java procedure is not an option; I have to write this in PL/SQL. – JaskeyLam Aug 28 '14 at 03:45
  • "Normally, data being written to a file is buffered. The FFLUSH procedure forces the buffered data to be written to the file. The data must be terminated with a newline character." This statement from the Oracle docs is really confusing to me. It says that when I fflush, it forces the data to be written to the file. But I fflush every single char, and the char does not have a newline terminator. Why has this worked without problems for years? – JaskeyLam Aug 28 '14 at 03:49
  • @Jaskey Please understand the statement: FFLUSH DOES NOTHING without a NEWLINE TERMINATOR. If and only if there is a newline terminator, fflush will flush everything in your buffer up to and including that newline terminator! With a short varchar you don't get a problem, because FCLOSE will flush everything eventually! But with long text your buffer will run full! – Falco Aug 28 '14 at 12:56
  • @Jaskey If you just want to fix your problem and cannot change anything in the other API, you have to change your code like so: `IF (l_char = chr(10) OR MOD( l_offset, 32000 ) = 0 )`. Then you will just insert a newline every 32K chars and it will work, but you will insert newlines into the file (see the sketch after these comments)... – Falco Aug 28 '14 at 12:59
  • Will this influence the XML interpretation? – JaskeyLam Aug 28 '14 at 16:31
  • Yes, it could - you should probably rewrite the logic so it only inserts line breaks after closing tags; then there shouldn't be any problem... – Falco Aug 29 '14 at 08:49
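
To make this concrete, here is a sketch of how the question's loop could look with the forced line break. l_since_nl is an extra PLS_INTEGER to declare; the 30000 threshold and the check for '>' are assumptions, chosen to break well before 32767 and to keep the inserted newlines between tags rather than inside text content:

    l_offset   := 1;
    l_since_nl := 0;  -- characters written since the last newline
    WHILE (l_offset <= l_length)
    LOOP
        l_char := dbms_lob.substr(l_xmldoc, 1, l_offset);

        IF (l_char = chr(10))
        THEN
            API_XXX.new_line(API_XXX.output, 1);
            l_since_nl := 0;
        ELSE
            API_XXX.put(API_XXX.output, l_char);
            l_since_nl := l_since_nl + 1;

            -- force a line break before the utl_file buffer can reach 32767 bytes;
            -- breaking only after '>' assumes a tag boundary occurs often enough
            IF (l_since_nl >= 30000 AND l_char = '>')
            THEN
                API_XXX.new_line(API_XXX.output, 1);
                l_since_nl := 0;
            END IF;
        END IF;

        l_offset := l_offset + 1;
    END LOOP;

Whether the extra whitespace matters depends on how the XML is consumed, as discussed in the comments above.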

Just a guess, but instead of "to_char(10)" you might try chr(10) to determine/write a newline. Not sure if this will solve your problem, but sometimes very long lines (without newlines) can cause issues.

For example:

declare
    l_clob clob;
    l_char char;
begin
    l_clob := 'Line 1' || chr(10) || 'Line 2' || chr(10);

    for i in 1 .. DBMS_LOB.GETLENGTH(l_clob)
    loop
        l_char := dbms_lob.substr(l_clob, 1, i);

        if (l_char = chr(10)) then
        --if (l_char = to_char(10)) then
            dbms_output.put_line('Found a newline at position ' || i);
        end if;
    end loop;

end;

Notice the difference between chr(10) and to_char(10). Easy enough to test if this solves your problem anyway.

tbone
  • Would you please share a code sample? Also, my API calls fflush every time after putting a single char, so we do not write a line for now. – JaskeyLam Aug 14 '14 at 01:52
  • I saw somebody upvoted this answer. Do you mean that replacing to_char(10) with chr(10) will help? What is the difference? Would you please help explain a little bit? – JaskeyLam Aug 14 '14 at 06:55
  • TO_CHAR(10) converts 10 (the number) to '10' (characters). CHR(10) converts a decimal value into a character, in your case the newline symbol. – DARK_A Aug 14 '14 at 10:56
  • chr(10) is a newline. It's not the same as to_char(10), which converts the number 10 to the string '10'. So your code will never find a newline, and will never write a newline (your API code that calls utl_file.new_line will never be called). And fflush needs the newline terminator, I believe. It's simple enough to fix your code and test anyway. – tbone Aug 14 '14 at 12:58
  • @tbone Thank you! Since I cannot easily test the issue, I will wait for the customer's feedback. I am amazed by your answer, since this code has worked for years... but I will still try that. – JaskeyLam Aug 14 '14 at 14:59
  • @tbone, I tried that, but it did not work; the same error is still raised. Please note that the API flushes every time after I put(). So I guess the buffer or something is not relevant. How can I get more help with this problem? It is really frustrating, since I have worked on it for months. – JaskeyLam Aug 21 '14 at 03:32
  • @tbone, I have a new finding: the loop does not encounter a newline terminator for a long stretch! So the buffer grows large enough to fail. So I think the problem is how to turn my xmldoc into varchars with newline terminators. – JaskeyLam Aug 22 '14 at 07:06
  • Hi @Jaskey, you'll want to check that you call fopen with a linesize of 32767 and that your buffer stays within this limit (call new_line before hitting it). Or, instead of processing a char at a time, rewrite it to load a string buffer of VARCHAR2(32767) and then write it out using put_line; see the sketch below. – tbone Aug 22 '14 at 09:39
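
A rough sketch of this chunked approach, adapted to the question's API: it assumes API_XXX.put passes a multi-character buffer straight through to utl_file.put (as the quoted source suggests); the wrapper name write_xml_in_chunks and the 4000-character chunk size are made up, and each chunk boundary still adds a newline to the output, like the other workarounds above:

    -- Hypothetical wrapper; p_doc would be the CLOB returned by getReportXML.
    PROCEDURE write_xml_in_chunks(p_doc IN CLOB) IS
        l_len    PLS_INTEGER := dbms_lob.getlength(p_doc);
        l_pos    PLS_INTEGER := 1;
        l_chunk  VARCHAR2(32767);
        c_chunk  CONSTANT PLS_INTEGER := 4000;    -- arbitrary size, well below the 32767 max_linesize
    BEGIN
        WHILE l_pos <= l_len
        LOOP
            l_chunk := dbms_lob.substr(p_doc, c_chunk, l_pos);
            API_XXX.put(API_XXX.output, l_chunk);
            API_XXX.new_line(API_XXX.output, 1);  -- terminate the line so fflush can empty the buffer
            l_pos := l_pos + c_chunk;
        END LOOP;
    END write_xml_in_chunks;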