ResultSetMetaData metaData = resultSet.getMetaData();
List<Map<String, Object>> records = new ArrayList<>();
// Build one map per row: column name -> value
while (resultSet.next()) {
    Map<String, Object> record = new HashMap<>();
    for (int c = 1; c <= metaData.getColumnCount(); c++) {
        String columnName = metaData.getColumnName(c);
        Object value = resultSet.getObject(columnName);
        record.put(columnName, value);
    }
    records.add(record);
}

The above while loop runs 5 million times multiplied by the number of columns. Is there a better, more efficient way to handle this?
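For reference, a minimal sketch of the same loop with less per-row overhead (column names resolved once, values read by index, and a fetch-size hint so the driver pulls rows in batches), assuming the same `resultSet` and `metaData` as above:

// Sketch: resolve column names once, read values by index
resultSet.setFetchSize(1000); // hint: fetch rows in batches of ~1000
int columnCount = metaData.getColumnCount();
String[] columnNames = new String[columnCount + 1];
for (int c = 1; c <= columnCount; c++) {
    columnNames[c] = metaData.getColumnName(c);
}
while (resultSet.next()) {
    Map<String, Object> record = new HashMap<>(columnCount);
    for (int c = 1; c <= columnCount; c++) {
        record.put(columnNames[c], resultSet.getObject(c));
    }
    // consume the record here instead of keeping all 5 million rows in memory
}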

Shree
  • How about changing your design so that it does not run **5 million times multiplied by the number of columns**? :D – Jude Niroshan Jul 19 '17 at 07:33
  • Is there any way other than looping with the while loop and for loop, e.g. converting the entire ResultSet into a JSONObject directly? – Shree Jul 19 '17 at 07:46
  • You have many options: server-side caching, fetching only what you need from the server, writing stored procedures, adding more filters on the client side to narrow the queries, etc. Just look at it from an abstract view and try to find a way. – Jude Niroshan Jul 19 '17 at 07:50

1 Answer


You should probably do this with pagination rather than sending one big JSON chunk back in the response. Not only does it increase memory/CPU usage, it also incurs a lot of network traffic, and it causes performance issues in the front end when processing and rendering such a big response.

With pagination, you would return the first n records along with metadata (i.e. total number of records, total number of pages, current page, etc.). If you are using Spring, have a look at this pagination example.
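As a rough illustration (the class and field names here are made up, not taken from the linked example), the paged response could look like:

import java.util.List;

// Hypothetical wrapper for one page of results plus its metadata
public class PageResponse<T> {
    private List<T> records;    // the n records of the current page
    private long totalRecords;  // total rows matching the query
    private int totalPages;     // ceil(totalRecords / pageSize)
    private int currentPage;    // zero-based index of this page
    // constructor and getters/setters omitted
}

The client then asks for page 0, 1, 2, ... instead of everything at once.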

At the DB layer, you can implement it depending on the database type (e.g. for MySQL, you can use LIMIT). This SO answer explains how to do pagination in MySQL.
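A minimal JDBC sketch of that approach (the table name `my_table` and the `id` ordering column are placeholders, and a zero-based `page` index is assumed):

import java.sql.*;
import java.util.*;

// Fetch a single page of rows with LIMIT/OFFSET (MySQL syntax)
List<Map<String, Object>> fetchPage(Connection conn, int page, int pageSize) throws SQLException {
    String sql = "SELECT * FROM my_table ORDER BY id LIMIT ? OFFSET ?";
    try (PreparedStatement ps = conn.prepareStatement(sql)) {
        ps.setInt(1, pageSize);
        ps.setInt(2, page * pageSize);
        try (ResultSet rs = ps.executeQuery()) {
            ResultSetMetaData md = rs.getMetaData();
            List<Map<String, Object>> rows = new ArrayList<>();
            while (rs.next()) {
                Map<String, Object> row = new LinkedHashMap<>();
                for (int c = 1; c <= md.getColumnCount(); c++) {
                    row.put(md.getColumnName(c), rs.getObject(c));
                }
                rows.add(row);
            }
            return rows;
        }
    }
}

Note that OFFSET gets slower as the page number grows; for very deep pages, keyset pagination (WHERE id > lastSeenId ORDER BY id LIMIT n) is usually faster.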

Darshan Mehta
  • Is there any convenient way other than pagination for this? – Shree Jul 19 '17 at 07:45
  • You could stream the data in bytes rather than waiting for the whole `json` object to be constructed (see the sketch below). But it is a little complicated and requires a change in the way the front end queries the data. – Darshan Mehta Jul 19 '17 at 07:49
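To make that comment concrete, here is a sketch using Jackson's streaming `JsonGenerator` (the `resultSet` and the target `OutputStream`, e.g. the servlet response stream, are assumed to be available); each row is written out as soon as it is read, so the full multi-million-row JSON never sits in memory:

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.OutputStream;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;

// Write the rows as a JSON array, one object per row, directly to the stream
void streamRows(ResultSet resultSet, OutputStream out) throws Exception {
    ResultSetMetaData metaData = resultSet.getMetaData();
    try (JsonGenerator gen = new ObjectMapper().getFactory().createGenerator(out)) {
        gen.writeStartArray();
        while (resultSet.next()) {
            gen.writeStartObject();
            for (int c = 1; c <= metaData.getColumnCount(); c++) {
                gen.writeObjectField(metaData.getColumnName(c), resultSet.getObject(c));
            }
            gen.writeEndObject();
        }
        gen.writeEndArray();
    }
}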