
I have a very big .json file (over 5 GB) and I want to convert it to .xml format. Is there any software or any other way (anything goes) to do that?

I found XML Editor, but it only offers an XML => JSON converter.

  • See this post: [http://stackoverflow.com/questions/8988775/convert-json-to-xml-in-python](http://stackoverflow.com/questions/8988775/convert-json-to-xml-in-python). Someone has already done it for you, in Python. – dsXml Jan 07 '16 at 23:08
  • You can try to find an answer to this on softwarerecs – Marged Jan 07 '16 at 23:10
  • You can use the **xml2json-converter** tool (https://sourceforge.net/projects/xml2json-converter/), a Java GUI application that uses the [Staxon](https://github.com/beckchr/staxon) library. The tool is open source, can be found on [GitHub](https://github.com/AntonMykolaienko/xml2json), and can also be run from the command line without the GUI. – Anton Mykolaienko Jun 09 '17 at 09:19

2 Answers


The typical way to process XML as well as JSON files is to load the file completely into memory. You then have a so-called DOM (document object model) that allows various kinds of data processing. But neither XML nor JSON is really designed for storing as much data as you have here. In my experience you will typically run into memory problems as soon as you exceed a limit of roughly 200 MB, because the DOM is built from many individual objects, and that results in a memory overhead far exceeding the amount of data you actually want to process.
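
To see why, consider the naive in-memory approach below, a minimal sketch in PHP (the file name is just a placeholder): with a 5 GB input, json_decode() will exhaust PHP's memory limit long before you get a chance to write any XML.

    // Naive in-memory approach: decode the entire file at once.
    // With a 5 GB input this exceeds any realistic memory_limit,
    // because the decoded object tree is much larger than the raw text.
    $data = json_decode(file_get_contents('big.json'));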

The only practical way for you to process files like that is to take a streaming approach. The basic idea: instead of parsing the whole file and loading it into memory, you parse and process the file "on the fly". As data is read, it is parsed, and events are triggered to which your software can react and perform whatever actions are needed. (For details, have a look at the SAX API to understand this concept in more depth.)

As you stated, you are processing JSON, not XML. Streaming APIs for JSON are available in the wild as well, and you could even implement one fairly easily yourself, since JSON is a pretty simple data format. A sketch of a streaming conversion follows below.
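
As an illustration only, here is a minimal sketch of that idea in PHP. It assumes the third-party json-machine package (halaxa/json-machine, installed via Composer) as the streaming JSON reader, PHP's built-in XMLWriter for writing the XML incrementally, and a top-level JSON array of records; the file names and the wrapping element names (items, item) are arbitrary choices for the example.

    <?php
    // Sketch only: stream a huge JSON array into an XML file
    // without ever holding the whole document in memory.
    require 'vendor/autoload.php';      // composer require halaxa/json-machine

    use JsonMachine\Items;

    // Recursively write one decoded JSON value as XML elements.
    function writeNode(XMLWriter $xml, string $name, $value): void
    {
        if (is_object($value)) {
            $value = get_object_vars($value);
        }
        if (is_array($value)) {
            $xml->startElement($name);
            foreach ($value as $key => $child) {
                // JSON array indexes are not valid XML element names;
                // object keys are assumed to be valid element names.
                writeNode($xml, is_int($key) ? 'item' : $key, $child);
            }
            $xml->endElement();
        } else {
            if (is_bool($value)) {
                $value = $value ? 'true' : 'false';
            }
            $xml->startElement($name);
            $xml->text($value === null ? '' : (string) $value);
            $xml->endElement();
        }
    }

    $xml = new XMLWriter();
    $xml->openUri('big.xml');           // write straight to disk
    $xml->startDocument('1.0', 'UTF-8');
    $xml->startElement('items');

    // Items::fromFile() yields one top-level array element at a time.
    foreach (Items::fromFile('big.json') as $record) {
        writeNode($xml, 'item', $record);
        $xml->flush();                  // keep the writer's buffer small
    }

    $xml->endElement();
    $xml->endDocument();
    $xml->flush();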

Nevertheless, such an approach is not optimal: it typically results in very slow data processing because of the millions of method invocations involved. For every item encountered you usually have to call a method to perform some data-processing task, and together with the additional checks about what kind of information you have just encountered in the stream, this slows down processing quite a bit.

You should therefore also consider a different kind of approach: first split your file into many small ones, then process those. This approach might not seem very elegant, but it keeps your task much simpler, and you gain one main advantage: it will be much easier for you to debug your software. A sketch of the splitting step follows below.
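
A minimal sketch of that splitting step, again assuming the json-machine package mentioned above and a top-level JSON array; the chunk size and file names are arbitrary:

    <?php
    // Sketch only: split one huge JSON array into many small JSON files
    // of at most $chunkSize records each; the small files can then be
    // converted (and debugged) one at a time.
    require 'vendor/autoload.php';      // composer require halaxa/json-machine

    use JsonMachine\Items;

    $chunkSize = 10000;                 // records per output file (tune as needed)
    $buffer    = [];
    $fileNo    = 0;

    foreach (Items::fromFile('big.json') as $record) {
        $buffer[] = $record;
        if (count($buffer) === $chunkSize) {
            file_put_contents(sprintf('chunk-%05d.json', $fileNo++), json_encode($buffer));
            $buffer = [];
        }
    }

    if ($buffer !== []) {               // write the remainder
        file_put_contents(sprintf('chunk-%05d.json', $fileNo), json_encode($buffer));
    }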

BHUK

If you're willing to use the XML Serializer from PEAR, you can convert the JSON to a PHP object and then the PHP object to XML in two easy steps:

Check this link for more: convert json to xml

A little example:

include("XML/Serializer.php");

function json_to_xml($json) {
    $serializer = new XML_Serializer();
    $obj = json_decode($json);

    if ($serializer->serialize($obj)) {
        return $serializer->getSerializedData();
    }
    else {
        return null;
    }
}
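
For completeness, one possible way to call it (file names are placeholders). Note that this reads the whole JSON file into memory, so it suits small inputs, such as the chunk files produced by the splitting approach in the other answer, rather than the original 5 GB file:

    $xml = json_to_xml(file_get_contents('small-input.json'));
    if ($xml !== null) {
        file_put_contents('output.xml', $xml);
    }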

Good luck, and give it a try.

BlackHack123