
I have tried spooling and also using DBMS_LOB, but I am still not able to export CLOB columns that have a length of 45,000.

Below is the code:

-- spool file header

set head off
set verify off
set echo off
set pages 0
set long 20000
set longc 20000
set lines 32767
set trimspool on
set feedback off
set termout off

-- util script

SET serveroutput ON
DECLARE
  l_file   utl_file.file_type;
  l_clob   CLOB;
  l_buffer VARCHAR2(32767);
  l_amount binary_integer := 32767;
  l_pos    INTEGER        := 1;
BEGIN
  -- Pick the shortest CLOB in the table.
  SELECT mo_xml
  INTO   l_clob
  FROM   (SELECT mo_xml, LENGTH(mo_xml) len FROM test_xml ORDER BY len ASC)
  WHERE  rownum = 1;

  l_file := utl_file.fopen('test', 'TEMP.TXT', 'w', 32767);

  -- Read the CLOB in 32767-character chunks and write each one to the file.
  LOOP
    dbms_lob.read(l_clob, l_amount, l_pos, l_buffer);
    utl_file.put(l_file, l_buffer);
    utl_file.fflush(l_file);
    l_pos := l_pos + l_amount;
  END LOOP;
EXCEPTION
  WHEN no_data_found THEN
    -- Expected end: dbms_lob.read raises NO_DATA_FOUND once the whole LOB has been read.
    IF utl_file.is_open(l_file) THEN
      utl_file.fclose(l_file);
    END IF;
  WHEN OTHERS THEN
    IF utl_file.is_open(l_file) THEN
      utl_file.fclose(l_file);
    END IF;
    RAISE;
END;
/
  • Why [`SET LONG 20000`](https://docs.oracle.com/cd/E11882_01/server.112/e16604/ch_twelve040.htm#SQPUG087)? *Sets maximum width (in bytes)* – astentx Mar 10 '21 at 11:17
  • I tried to change the value, but the issue is with the SET LINES limit: I can't print values whose length is > 32767 – Bala krishnan Mar 10 '21 at 12:02
  • Ah, so you are writing into a file in some server directory? That has nothing to do with the `spool` command of SQL*Plus, so all the options at the beginning are not relevant at all (why have you included them?). So check this [answer](https://stackoverflow.com/a/25640512/2778710) – astentx Mar 10 '21 at 15:08
  • I added that here to show that I have tried both ways, spooling through SQL*Plus and through the UTL_FILE script, but neither worked – Bala krishnan Mar 10 '21 at 16:02

2 Answers

1

Try using DBMS_LOB.CLOB2FILE instead.

declare
    v_clob clob;
begin
    select mo_xml
    into v_clob
    from (select mo_xml from test_xml order by dbms_lob.getlength(mo_xml))
    where rownum = 1;

    --In 12.2 and above:
    dbms_lob.clob2file(v_clob, flocation => 'test', fname => 'TEMP.TXT');
    --In 11.2 and below the procedure is in a different package:
    --dbms_xslprocessor.clob2file(v_clob, flocation => 'TEMP_DIR', fname => 'TEST.TXT');
end;
/
Jon Heller
  • I tried dbms_xslprocessor.clob2file yesterday; it processes only one row. In our table we have 2k records. We could go with dbms_lob.clob2file / dbms_xslprocessor.clob2file if we had only one record. Thanks for your time on this. :-) – Bala krishnan Mar 11 '21 at 04:42
  • @Balakrishnan Do you want each of the 2K records to be in a separate file? Or are you trying to concatenate all of the records into a single file? – Jon Heller Mar 11 '21 at 05:00
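If each of the 2K records should end up in its own file, a per-row loop around the same CLOB2FILE call might look like the sketch below. The directory object 'test' and the table test_xml come from the question; the sequence-based file names are only an illustration.

declare
    l_seq pls_integer := 0;
begin
    -- One output file per row; the naming pattern below is just a placeholder.
    for r in (select mo_xml from test_xml) loop
        l_seq := l_seq + 1;
        dbms_lob.clob2file(r.mo_xml,
                           flocation => 'test',
                           fname     => 'mo_xml_' || l_seq || '.xml');
    end loop;
end;
/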
0

Sorry to say it, but even the official documentation for utl_file.put says that the maximum size of the buffer parameter (the second one, where your l_buffer goes) is 32767 bytes. I believe you have to split the content into smaller pieces first and flush them to the file one by one, as in the sketch below.
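A minimal sketch of that chunk-and-flush pattern, reusing the table and directory names from the question; the 8,000-character chunk size is an arbitrary choice that stays under the 32,767-byte buffer limit even if characters expand to multiple bytes:

declare
    l_file  utl_file.file_type;
    l_clob  clob;
    l_len   pls_integer;
    l_pos   pls_integer := 1;
    l_chunk constant pls_integer := 8000;  -- well under the 32767-byte put limit
begin
    select mo_xml
    into   l_clob
    from   (select mo_xml from test_xml order by dbms_lob.getlength(mo_xml))
    where  rownum = 1;

    l_len  := dbms_lob.getlength(l_clob);
    l_file := utl_file.fopen('test', 'TEMP.TXT', 'w', 32767);

    -- Write the CLOB piece by piece, flushing after every chunk.
    while l_pos <= l_len loop
        utl_file.put(l_file, dbms_lob.substr(l_clob, l_chunk, l_pos));
        utl_file.fflush(l_file);
        l_pos := l_pos + l_chunk;
    end loop;

    utl_file.fclose(l_file);
end;
/

Note that UTL_FILE also caps max_linesize at 32,767 bytes, so whether this succeeds for content that contains no newlines is exactly what the question is struggling with; treat it as an illustration of the split-and-flush idea rather than a verified fix.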

ekochergin
  • I cannot split the lines because I am building an XML document that has to be transferred to a third party – Bala krishnan Mar 10 '21 at 16:03
  • @Balakrishnan Do you want to put the XMLs from multiple rows into a single file or into separate documents? Or as nested structures inside one big XML? – astentx Mar 11 '21 at 08:10