I have a finite stream of n entries. Its count is unknown in advance. The data is around 10 GB, which is too big to fit in RAM, so I cannot read it in as a whole. What would be a way to process that stream in chunks of 100000 entries each?
All I have is a

Stream<?> blocks

so methods such as subList are not available to me.
I imagine it looking something like this in code:
IntStream
    .range(0, Integer.MAX_VALUE)
    .filter(s -> s % 100000 == 0)
    .forEach(s -> MyClass.doSomething(blocks
        .skip(s)
        .limit(100000)
        .collect(Collectors.toList())));
But then I get an IllegalStateException ("stream has already been operated upon or closed"), because a stream can only be traversed once: the first terminal operation consumes blocks, and every later skip/collect pass fails. Is there some workaround?
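The closest I have come to a workaround is dropping down to the stream's iterator and buffering entries by hand, so only one chunk is ever held in memory. This is just a sketch (processInChunks and ChunkedStream are names I made up), and I'm not sure it is idiomatic:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;
import java.util.stream.Stream;

public class ChunkedStream {

    // Pull entries one by one through the stream's iterator,
    // buffering up to chunkSize of them before handing the
    // buffer to the action. The stream is traversed exactly once.
    public static <T> void processInChunks(Stream<T> stream,
                                           int chunkSize,
                                           Consumer<List<T>> action) {
        Iterator<T> it = stream.iterator();
        List<T> chunk = new ArrayList<>(chunkSize);
        while (it.hasNext()) {
            chunk.add(it.next());
            if (chunk.size() == chunkSize) {
                action.accept(chunk);
                chunk = new ArrayList<>(chunkSize); // start a fresh buffer
            }
        }
        if (!chunk.isEmpty()) {
            action.accept(chunk); // final, possibly shorter, chunk
        }
    }
}
```

which I would then call as processInChunks(blocks, 100000, MyClass::doSomething). Is this reasonable, or is there a cleaner Stream-native way?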