I am building a REST API with Spring Boot that needs to transfer very large JSON payloads to a client. To avoid excessive memory usage and huge array allocations, I am using StreamingResponseBody to send the data in chunks:
@Controller
class FooController {

    @Autowired
    FooService fooService;

    // ...

    private ResponseEntity<StreamingResponseBody> getPropertyVariants(
            @PathVariable(required = false) String propertyName,
            @RequestParam(required = false) String instruction) throws JsonProcessingException {
        StreamingResponseBody streamingResponseBody = out -> {
            if (propertyName == null) fooService.writeReportToOutStream(out);
            else fooService.writeReportToOutStream(propertyName, out);
        };
        return ResponseEntity.ok()
                .contentType(MediaType.APPLICATION_JSON)
                .body(streamingResponseBody);
    }
}
FooService holds a massive array of data, which it filters and then writes into the stream using a JsonGenerator. I cannot show the actual service here; rest assured, the resulting JSON array is written into the stream entry by entry, everything is flushed properly, and the JsonGenerator is closed. With around 100,000 entries in the output array, everything works fine. However, if I increase the output to 1,000,000 entries, the request fails in the middle of the transfer.
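For reference, this is roughly what the service's write loop looks like. The class, method, and field names below are placeholders standing in for my real code, and the entries are simplified to plain strings:

```java
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class FooServiceSketch {

    // Streams the entries as one JSON array, writing entry by entry
    // instead of materializing the whole array in memory first.
    public static void writeReportToOutStream(List<String> entries, OutputStream out) throws IOException {
        JsonFactory factory = new JsonFactory();
        try (JsonGenerator gen = factory.createGenerator(out, JsonEncoding.UTF8)) {
            gen.writeStartArray();
            for (String entry : entries) {
                gen.writeStartObject();
                gen.writeStringField("value", entry);
                gen.writeEndObject();
                gen.flush(); // flush per entry so the container can send chunks early
            }
            gen.writeEndArray();
        } // try-with-resources closes the generator and flushes the tail
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        writeReportToOutStream(List.of("a", "b"), buf);
        System.out.println(buf.toString(StandardCharsets.UTF_8)); // [{"value":"a"},{"value":"b"}]
    }
}
```

In the real service, `out` is the OutputStream handed to the lambda by StreamingResponseBody.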
Parts of the stack trace:
org.apache.coyote.CloseNowException: Failed write
at org.apache.coyote.http11.Http11OutputBuffer$SocketOutputBuffer.doWrite(Http11OutputBuffer.java:548)
at org.apache.coyote.http11.filters.ChunkedOutputFilter.doWrite(ChunkedOutputFilter.java:110)
at org.apache.coyote.http11.Http11OutputBuffer.doWrite(Http11OutputBuffer.java:193)
at org.apache.coyote.Response.doWrite(Response.java:606)
at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:340)
at org.apache.catalina.connector.OutputBuffer.flushByteBuffer(OutputBuffer.java:783)
at org.apache.catalina.connector.OutputBuffer.doFlush(OutputBuffer.java:299)
at org.apache.catalina.connector.OutputBuffer.flush(OutputBuffer.java:273)
at org.apache.catalina.connector.CoyoteOutputStream.flush(CoyoteOutputStream.java:118)
at com.fasterxml.jackson.core.json.UTF8JsonGenerator.flush(UTF8JsonGenerator.java:1178)
at com.fasterxml.jackson.databind.ObjectMapper.writeValue(ObjectMapper.java:3060)
at com.fasterxml.jackson.core.base.GeneratorBase.writeObject(GeneratorBase.java:388)
at de.jmzb.ecomwdc.service.properties.FooService.lambda$null$2(FooService.java:37)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:485)
at de.jmzb.ecomwdc.service.properties.FooService.lambda$filter$10(KeyFilterStrategy.java:34)
at java.util.ArrayList.forEach(ArrayList.java:1259)
at de.jmzb.ecomwdc.service.properties.PropertyReportService.writeReportToOutStream(FooService.java:56)
at de.jmzb.ecomwdc.controller.WDCDataSourceController.lambda$getPropertyVariants$1(FooController.java:77)
at org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBodyReturnValueHandler$StreamingResponseBodyTask.call(StreamingResponseBodyReturnValueHandler.java:111)
at org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBodyReturnValueHandler$StreamingResponseBodyTask.call(StreamingResponseBodyReturnValueHandler.java:98)
at org.springframework.web.context.request.async.WebAsyncManager.lambda$startCallableProcessing$4(WebAsyncManager.java:337)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Error on httpie:
HTTP/1.1 200
Connection: keep-alive
Content-Encoding: gzip
Content-Type: application/json
Date: Thu, 27 May 2021 15:21:46 GMT
Keep-Alive: timeout=60
Transfer-Encoding: chunked
Vary: origin,access-control-request-method,access-control-request-headers,accept-encoding
http: error: ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
I hope to be able to handle data of any size once the service's input no longer has to be stored in memory. Apparently the OutputStream stops working at some point?
Idea 1: for some reason, one chunk of the transfer has the wrong length, the client aborts the request, the OutputStream closes, and therefore the server throws the exception.
Idea 2: for some reason, the server closes the OutputStream itself, leading to an unexpected end of the response on the client.
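One thing I have not ruled out yet: as far as I understand, StreamingResponseBody is executed on Spring MVC's async request machinery, so the async request timeout might be cutting off the long-running write on the server side. If that were the cause, raising it in application.properties should change the behavior (the value below is just an example, not a setting I have verified):

```properties
# Timeout for asynchronous request processing, in milliseconds
# (StreamingResponseBody responses are written asynchronously).
spring.mvc.async.request-timeout=300000
```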
Any ideas on how to solve this? I am sorry that I cannot share my original code.
Thanks for any help!