I have JSON POST data with the following template:

 {

    "themeId" : JSONString,
    "themeName" : JSONString,
    "tables" : [{
        "tableName" : JSONString,
        "records" : [{
            "recordVersion" : JSONString,
            "tableItems" : [] 
        }]
    }]

}

and on the Java side I have a REST API like this:

@POST
@Path("/{themeId}")
@Consumes({MediaType.APPLICATION_JSON})
public Response postTheme( @PathParam("themeId") String themeId, ThemeDictionary dictionary) throws InterruptedException {
    //code to handle
}

It works fine when the POST data is less than 2 MB, but how do I handle data bigger than 2 MB?

Questions

1) Should I go with pagination?

2) If I split the JSON in half, each half won't be valid JSON. So, should I accept strings and concatenate them on the server side?

3) Are there any good examples that handle this scenario?

4) I'm looking for an approach that can handle JSON data both smaller and larger than 2 MB.

javaMan

5 Answers

Pagination will not solve your problem, since you are sending data to the server, not receiving it.

Which servlet container do you use? This looks like Tomcat's default POST size limit.

If you are using standalone Tomcat, you need to set the maxPostSize parameter on your Connector: see here or here.
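As a sketch, the Connector in Tomcat's conf/server.xml might look like this (the port and timeout values are illustrative; maxPostSize is in bytes, and Tomcat's default is 2 MB, which matches the limit described in the question):

```xml
<!-- conf/server.xml: raise the POST body limit to 50 MB (52428800 bytes).
     Setting maxPostSize to a negative value disables the limit entirely. -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443"
           maxPostSize="52428800" />
```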

Konstantin Konyshev
2 MB is rather small. I think uploading the JSON file as multipart and then processing it normally can handle files up to about 50 MB. An example of handling file uploads can be found here.

For JSON files of hundreds of MB, you have to process them as a stream, or split the file into smaller files.

Duong Nguyen
Pagination would be a good option, but it needs manual intervention. Alternatively, you can send multiple async requests (i.e., send records 1-200 in one request and 200-400 in the next), but this is not a recommended approach, since your server will receive more requests depending on the number of records.

Venkadesh
JSON compresses very well, so you should think about compressing the payload.
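As a rough sketch of that idea, the payload can be gzipped before posting it (the class and method names here are illustrative, not from the answer; repetitive JSON like the question's tables array typically shrinks dramatically):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class JsonGzip {

    // Compress a JSON string with gzip before sending it in the request body.
    public static byte[] compress(String json) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return bos.toByteArray();
    }

    // Decompress the body back to the original JSON string on the server.
    public static String decompress(byte[] data) throws IOException {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(data))) {
            return new String(gz.readAllBytes(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws IOException {
        StringBuilder sb = new StringBuilder("{\"tables\":[");
        for (int i = 0; i < 200; i++) {
            if (i > 0) sb.append(',');
            sb.append("{\"tableName\":\"t").append(i).append("\",\"records\":[]}");
        }
        sb.append("]}");
        String json = sb.toString();
        byte[] packed = compress(json);
        System.out.println("original=" + json.length() + " compressed=" + packed.length);
        System.out.println(decompress(packed).equals(json)); // round-trip check
    }
}
```

On the wire you would set Content-Encoding: gzip on the request and decompress in the resource (or let a servlet filter do it), but the compression step itself is just the above.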

Yes, you should go with pagination, but it has some downsides, such as consistency.

You should not divide the payload into arbitrary strings. I suggest sending meaningful chunks, so that the pagination itself is meaningful: if one part (block) of the message goes missing, you only need to re-send that part, not all of them.

"How do you eat a really big fish? By slicing it thin."

Try to post smaller, meaningful parts. Otherwise your server needs more computing time to process the data, and your clients need more memory to hold it.

ykaragol
Is there any reason why you are not sending the data in one single request? Send the 50 MB as one request. Neither the JSON nor the HTTP POST specification imposes a size limit, as discussed in these SO questions:

Is there a limit on how much JSON can hold?

Is Http POST limitless?

If you are worried about the performance of your server, one option is to split your JSON logically so that the action can be performed in smaller chunks.

For example, if your tables array has 200 items, you could split it into smaller chunks, say 50 or 20 tables per request.

{

    "totalPages":2,
    "themeId" : JSONString,
    "themeName" : JSONString,
    "tables" : [{
        //first 50 tables
        "tableName" : JSONString,
        "records" : [{
            "recordVersion" : JSONString,
            "tableItems" : [] 
        }]
    }]

}

Next request

{
    "totalPages":2,
    "themeId" : JSONString,
    "themeName" : JSONString,
    "tables" : [{
        //next 50 tables
        "tableName" : JSONString,
        "records" : [{
            "recordVersion" : JSONString,
            "tableItems" : [] 
        }]
    }]

}

If you do not need the complete data to process the request, you can act on the data as it arrives. If you do, store the tables arrays in a db/file/memory until the last page is received, then merge the JSON back together, process the request, and send back the proper response. In that second case there is not much performance improvement.
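The client-side chunking described above could be sketched like this (the class and method names are illustrative; each returned page would become the tables array of one paged request, with totalPages set to the page count):

```java
import java.util.ArrayList;
import java.util.List;

public class TablePager {

    // Split the tables list into pages of at most pageSize entries,
    // so each page can be posted as its own request.
    public static <T> List<List<T>> paginate(List<T> tables, int pageSize) {
        List<List<T>> pages = new ArrayList<>();
        for (int i = 0; i < tables.size(); i += pageSize) {
            int end = Math.min(i + pageSize, tables.size());
            pages.add(new ArrayList<>(tables.subList(i, end)));
        }
        return pages;
    }

    public static void main(String[] args) {
        List<Integer> tables = new ArrayList<>();
        for (int i = 0; i < 200; i++) tables.add(i);

        List<List<Integer>> pages = paginate(tables, 50);
        System.out.println(pages.size()); // 200 tables in pages of 50 -> 4 pages
    }
}
```

The server then correlates the pages by themeId and totalPages, processing each page on arrival or buffering until the last one, exactly as the answer describes.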

Nithish Thomas