I have a JSON file like this:

{
  "start": 1489730400000,
  "end": 1489733999999,
  "interval": 1000,
  "weight": 1,
  "augmented": true,
  "profileName": "Selene/prod",
  "prunedSamples": 0,
  "fleet": {"c4.2xlarge": 14.313278698132972},
  "costPerSecond": 0.000008259246540496541,
  "profileData": {
    "name": "ALL",
    "states": {"BLOCKED": 2281, "NEW": 0, "RUNNABLE": 125833, "TERMINATED": 0, "TIMED_WAITING": 23170429, "WAITING": 59901416},
    "location": "0",
    "hidden": [],
    "children": [{
      "name": "GarbageCollector.gc",
      "states": {"BLOCKED": 0, "NEW": 0, "RUNNABLE": 17069},
      "location": "0.0",
      "hidden": [],
      "children": [{
        "name": "ConcurrentMarkSweep.gc",
        "states": {"BLOCKED": 0, "NEW": 0, "RUNNABLE": 14977},
        "location": "0.0.0",
        "hidden": [],
        "level": 1
      },
      {
        "name": "ParNew.gc",
        "states": {"BLOCKED": 0, "NEW": 0, "RUNNABLE": 2092},
        "location": "0.0.1",
        "hidden": [],
        "level": 1
      }]
    }]
  }
}

This is just part of it. The actual file is much bigger and arrives in GZip format, which I decompress first, storing the decompressed content in a string. I use the following code for that:

URL url = new URL("http://example.com/Selene%20Prod?start=1490234400000&end=1490237999999&maxDepth=200&minimumCountsThreshold=0.00");
URLConnection myUrlConnection = url.openConnection();
GZIPInputStream gZIPInputStream = new GZIPInputStream(myUrlConnection.getInputStream());
StringBuffer decompressedStringBuffer = new StringBuffer();
byte[] buffer = new byte[4096];
int bytesRead;
while ((bytesRead = gZIPInputStream.read(buffer)) > 0) {
    String part = new String(buffer, 0, bytesRead, "UTF-8");
    decompressedStringBuffer.append(part);
}
gZIPInputStream.close();
String decompressedString = decompressedStringBuffer.toString();
JSONObject obj = new JSONObject(decompressedString);
JSONArray profileData = obj.getJSONObject("profileData").getJSONArray("children");

My code throws Caused by: java.lang.OutOfMemoryError: Java heap space at decompressedStringBuffer.append(part);. Since the file is too big to hold in memory, I thought about writing it to a file and then reading that file back to build the JSON, but the JSONObject I would create from the FileInputStream gives the same Caused by: java.lang.OutOfMemoryError: Java heap space.

The only data I actually need from the JSON is name and children under the profileData key.

Is there a way to convert only those to a JSONObject while parsing the InputStream, and ignore the rest?

If someone can think of a better way, I would appreciate that as well.


1 Answer


I don't think you need to go to the Jackson library; just try to parse your large JSON with Gson alone, using its streaming API.

Try this resource, which is an answer from the author of Gson. On the same page you will also find a working example for Jackson using the streaming and tree-model APIs (in case you want to move to Jackson).
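To make that concrete, here is a minimal sketch of the streaming approach, assuming Gson is on the classpath. `ProfileNames` is a hypothetical helper class, not part of your code: it walks the token stream, keeps only the `name` fields under `profileData` (descending through nested `children`), and calls `skipValue()` on everything else so skipped data is never buffered in memory.

```java
import com.google.gson.stream.JsonReader;

import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: extracts the "name" fields under "profileData"
// from a JSON token stream without ever building the whole document.
public class ProfileNames {

    public static List<String> extract(Reader source) throws IOException {
        List<String> names = new ArrayList<>();
        JsonReader reader = new JsonReader(source);
        reader.beginObject();
        while (reader.hasNext()) {
            if ("profileData".equals(reader.nextName())) {
                readNode(reader, names);   // recurse into the subtree we care about
            } else {
                reader.skipValue();        // skipped values are never materialized
            }
        }
        reader.endObject();
        reader.close();
        return names;
    }

    private static void readNode(JsonReader reader, List<String> names) throws IOException {
        reader.beginObject();
        while (reader.hasNext()) {
            String field = reader.nextName();
            if ("name".equals(field)) {
                names.add(reader.nextString());
            } else if ("children".equals(field)) {
                reader.beginArray();
                while (reader.hasNext()) {
                    readNode(reader, names);   // recurse into each child node
                }
                reader.endArray();
            } else {
                reader.skipValue();            // states, location, hidden, level ...
            }
        }
        reader.endObject();
    }
}
```

You could feed it `new InputStreamReader(gZIPInputStream, "UTF-8")` directly, so the decompressed string is never built at all; that is what removes the heap-space problem rather than just moving it around.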

Also refer to the link below in case you want to increase your heap size: How to increase Java heap size
