
My team has an issue with a specific endpoint which, when called with certain parameters, returns a huge JSON response in chunks. For example, if the JSON has 1,000 rows, then about 30 seconds after opening the URL in our browser (it's a GET endpoint) we get 100 rows, then after a few more seconds the next 200, and so on until the JSON is exhausted. This is a problem for us because our application times out before retrieving the whole JSON. We want to emulate the behavior of that endpoint with an example endpoint of our own, for debugging purposes.

The following is what I have so far. For simplicity, I'm not even reading a JSON, just a randomly generated string. The logs show that I'm reading the data a few bytes at a time, writing it, and flushing the OutputStream. The crucial difference is that my browser (or Postman) shows me the data all at once at the very end, not in chunks. Is there anything I can do so that I can see the data coming back in chunks?


private static final int readBufSize = 10;

private static final int generatedStringSize = readBufSize * 10000;

@GetMapping(path = "/v2/payload/mocklargepayload")
public void simulateLargePayload(HttpServletResponse response) {
  try (InputStream inputStream = IOUtils.toInputStream(
           RandomStringUtils.randomAlphanumeric(generatedStringSize), StandardCharsets.UTF_8);
       OutputStream outputStream = response.getOutputStream()) {
    final byte[] buffer = new byte[readBufSize];
    for (int i = 0; i < generatedStringSize; i += readBufSize) {
      // read one chunk, leaving the last byte of the buffer free for a newline separator
      int bytesRead = inputStream.read(buffer, 0, readBufSize - 1);
      buffer[buffer.length - 1] = '\n';
      log.info("Read {} bytes: {}", bytesRead, new String(buffer, StandardCharsets.UTF_8));
      outputStream.write(buffer);
      log.info("Wrote bytes {}", new String(buffer, StandardCharsets.UTF_8));
      Thread.sleep(500);
      log.info("Flushing stream");
      outputStream.flush();
    }
  } catch (IOException | InterruptedException e) {
    log.error("Received exception: {}", e.getClass().getSimpleName());
  }
}
Jason
  • Have you tried StreamingResponseBody? https://stackoverflow.com/questions/59295514/how-to-stream-chunked-response-with-spring-boot-restcontroller – kerbermeister Feb 19 '23 at 14:47
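What the comment suggests could look roughly like the sketch below. This is a minimal sketch, assuming Spring MVC 4.2+ (which provides StreamingResponseBody) and reusing the readBufSize constant from the question; the path and the loop count are illustrative only.

// requires org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody
// and org.springframework.http.MediaType
@GetMapping(path = "/v2/payload/mocklargepayload-streaming", produces = MediaType.TEXT_PLAIN_VALUE)
public ResponseEntity<StreamingResponseBody> simulateLargePayloadStreaming() {
    StreamingResponseBody body = outputStream -> {
        for (int i = 0; i < 1000; i++) {
            // write one small chunk and flush so the container can send it immediately
            String chunk = RandomStringUtils.randomAlphanumeric(readBufSize - 1) + "\n";
            outputStream.write(chunk.getBytes(StandardCharsets.UTF_8));
            outputStream.flush();
            try {
                Thread.sleep(500);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    };
    // Spring writes the body on a worker thread instead of buffering the whole response
    return ResponseEntity.ok(body);
}

Because the return type is StreamingResponseBody, Spring keeps the response open and pushes each flushed chunk as it is written, which should make the chunks observable with a streaming client such as curl -N; browsers and Postman may still buffer the response before rendering it.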

1 Answer


Your endpoint should return a "Content-Length" header specifying the total size of the data it will return. That tells your client how much data to expect. The client can then read the data chunk by chunk as it becomes available. I had the reverse problem: I was writing a large payload into my endpoint (POST), and the endpoint was reading it faster than I was writing it, so at some point, once it had read everything available so far, it stopped reading, thinking that was all of it. So I wrote the following code, and you can implement the same logic on your client side:

@PostMapping
public ResponseEntity<String> uploadTest(HttpServletRequest request) {
    try {
        String lengthStr = request.getHeader("content-length");
        int length = TextUtils.parseStringToInt(lengthStr, -1);
        if (length > 0) {
            byte[] buff = new byte[length];
            ServletInputStream sis = request.getInputStream();
            int counter = 0;
            while (counter < length) {
                // read whatever has arrived so far without blocking
                int chunkLength = sis.available();
                byte[] chunk = new byte[chunkLength];
                int bytesRead = sis.read(chunk);
                if (bytesRead < 0) {
                    break; // stream ended before the declared content-length was reached
                }
                System.arraycopy(chunk, 0, buff, counter, bytesRead);
                counter += bytesRead;
                if (counter < length) {
                    // give the client time to send the next chunk
                    TimeUtils.sleepFor(5, TimeUnit.MILLISECONDS);
                }
            }
            Files.write(Paths.get("C:\\Michael\\tmp\\testPic.jpg"), buff);
        }
    } catch (Exception e) {
        System.out.println(TextUtils.getStacktrace(e));
    }
    return ResponseEntity.ok("Success");
}
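
On the client side, the same chunk-by-chunk reading could look roughly like this; a minimal sketch using only the JDK's HttpURLConnection, where the method name and URL are placeholders, not something from the original answer:

// requires java.io.ByteArrayOutputStream, java.io.IOException, java.io.InputStream,
// java.net.HttpURLConnection and java.net.URL
public static byte[] readInChunks(String endpointUrl) throws IOException {
    HttpURLConnection connection = (HttpURLConnection) new URL(endpointUrl).openConnection();
    try (InputStream in = connection.getInputStream();
         ByteArrayOutputStream out = new ByteArrayOutputStream()) {
        byte[] chunk = new byte[8192];
        int bytesRead;
        // read() returns as soon as some bytes arrive, so each chunk can be
        // processed while the server is still producing the rest of the response
        while ((bytesRead = in.read(chunk)) != -1) {
            out.write(chunk, 0, bytesRead);
            System.out.println("Received chunk of " + bytesRead + " bytes");
        }
        return out.toByteArray(); // complete payload once the stream ends
    } finally {
        connection.disconnect();
    }
}

The key design point is that the client's read timeout only needs to cover the gap between chunks, not the whole payload, because each read call returns as soon as the next chunk arrives.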

Also, I wrote a general read/write feature for the same problem (again for the server side), but you can implement the same logic on the client side as well. The feature reads the data in chunks as it becomes available. It comes with the open-source library MgntUtils (written and maintained by me); see the class WebUtils. The library with source code and Javadoc is available on GitHub here. The Javadoc is here. It is also available as a Maven artifact here.

Michael Gantman