
I'm using PL/JSON to parse data from MongoDB into Oracle DB. The packages work fine with Latin characters. However, whenever a json_value contains Chinese characters, the resulting value in Oracle is completely corrupted (question marks, symbols, etc.). As an example, I use the following line to parse:

Remarks := json_ext.get_string(json(l_list.get(i)),'Remarks');

I'm aware that the json type and the get_string function use varchar2 rather than nvarchar2. More importantly, the Oracle DB instance supports Chinese (when I insert directly into a table, no corruption occurs; it is only when I parse a Chinese JSON file that it gets corrupted). So my questions are: does PL/JSON support Chinese characters? Do I need to modify the package myself to accommodate them? Where exactly would it need to be fixed?
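For context, a minimal sketch of how that line sits in the surrounding code (simplified; the sample payload and the variable names other than `Remarks` and `l_list` are placeholders, not code from this question):

DECLARE
  l_json_text  VARCHAR2(32767);
  l_list       json_list;
  Remarks      VARCHAR2(4000);
BEGIN
  -- sample payload; in my case this text comes from an HTTP response
  l_json_text := '[{"Remarks":"some text"},{"Remarks":"更多备注"}]';
  l_list      := json_list(l_json_text);  -- parse the JSON array with PL/JSON
  FOR i IN 1 .. l_list.count LOOP
    -- extract the "Remarks" member of each array element
    Remarks := json_ext.get_string(json(l_list.get(i)), 'Remarks');
    DBMS_OUTPUT.PUT_LINE(Remarks);
  END LOOP;
END;
/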

In addition, I ran the following:

select value from nls_database_parameters where parameter='NLS_CHARACTERSET';

and it returns: AL32UTF8.

Doesn't that mean my DB is configured for UTF-8? Or are there other things that need to be checked for this purpose?
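For completeness, the national character set (which governs NVARCHAR2, used for `buf` in the update below) can be checked with the same view; this extra query is included only for illustration:

-- NLS_CHARACTERSET governs VARCHAR2/CHAR/CLOB; NLS_NCHAR_CHARACTERSET governs NVARCHAR2/NCHAR/NCLOB
select parameter, value
from nls_database_parameters
where parameter in ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');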

Update

Inspired by @jsumners' comment, I tried to track down the source of the corruption and found that it is present from the very beginning. That is, the snippet below runs before PL/JSON is used, to receive the HTTP response:

BEGIN
  LOOP
    UTL_HTTP.read_text(l_http_response, buf);
    DBMS_OUTPUT.PUT_LINE(buf);
    l_response_text := l_response_text || buf;
  END LOOP;
EXCEPTION
  WHEN UTL_HTTP.end_of_body THEN
    NULL;
END;

The line DBMS_OUTPUT.PUT_LINE(buf); prints corrupted JSON fields. I'm not sure whether this is related to the way I send and receive HTTP requests. Note that buf is an NVARCHAR2.

  • Does the corruption happen when you use the `.print()` method to print your JSON data to `dbms_output`? – James Sumners Dec 09 '14 at 03:28
  • In fact, it's corrupted from the beginning. Please see my updated question. – Hawk Dec 09 '14 at 11:00
  • My next suggestion would be to read the data into a BLOB as raw data with `utl_http.read_raw`. Then write the contents of that BLOB to a text file using `utl_file`. At that point, you should be able to determine where the corruption is occurring. – James Sumners Dec 09 '14 at 13:34
  • It was solved by adding `utl_http.set_body_charset(l_http_request, 'UTF-8');` when building the request (see the sketch after these comments). After that, there was no issue with PL/JSON supporting UTF-8 :) – Hawk Dec 10 '14 at 01:45
  • Post your solution as an answer. – James Sumners Dec 10 '14 at 01:50
  • Maybe you also want to have a look at https://github.com/doberkofler/PLSQL-JSON – doberkofler Jul 11 '15 at 09:03
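A minimal sketch of the request/response flow with that fix in place; the URL, method, and declarations are placeholders, and only the `set_body_charset` call and the read loop come from this question:

DECLARE
  l_http_request   UTL_HTTP.req;
  l_http_response  UTL_HTTP.resp;
  l_response_text  NVARCHAR2(32767);
  buf              NVARCHAR2(32767);
BEGIN
  -- placeholder endpoint and method; replace with the real service URL
  l_http_request := UTL_HTTP.begin_request('http://example.com/mongo/api', 'GET');

  -- declare the body character set so UTL_HTTP converts the UTF-8 response correctly
  UTL_HTTP.set_body_charset(l_http_request, 'UTF-8');

  l_http_response := UTL_HTTP.get_response(l_http_request);

  BEGIN
    LOOP
      UTL_HTTP.read_text(l_http_response, buf);
      l_response_text := l_response_text || buf;
    END LOOP;
  EXCEPTION
    WHEN UTL_HTTP.end_of_body THEN
      UTL_HTTP.end_response(l_http_response);
  END;
END;
/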
