I have a list of around 40,000 objects on average (it goes up to 80K at most) wrapped in a Spring Page:
Page<People> people = //loaded data using PeopleRepository of "People" entity
Then I generate a list of maps to export the records to an Excel file:
List<Map<String, String>> excelData = Streams.batches(people, 500)
        .parallel()
        .map(PageImpl::new)           // wrap each chunk in a Page
        .map(this::buildDTO)          // build DTO objects from People entities
        .map(Slice::getContent)
        .flatMap(Collection::stream)
        .map(this::buildExportData)
        .collect(Collectors.toList()); // <-- the NPE below is thrown here
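For context, Streams.batches is an in-house helper; the name and signature below are a hypothetical stand-in (our real implementation may differ) that chunks a list into fixed-size slices with List.subList:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class BatchDemo {

    // Hypothetical stand-in for Streams.batches: split a list into
    // consecutive chunks of at most batchSize elements.
    public static <T> List<List<T>> batches(List<T> source, int batchSize) {
        int chunkCount = (source.size() + batchSize - 1) / batchSize;
        return IntStream.range(0, chunkCount)
                .mapToObj(i -> source.subList(
                        i * batchSize,
                        Math.min((i + 1) * batchSize, source.size())))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 1200; i++) data.add(i);

        List<List<Integer>> chunks = batches(data, 500);
        System.out.println(chunks.size());        // 3 chunks: 500 + 500 + 200
        System.out.println(chunks.get(2).size()); // 200
    }
}
```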
The code above sporadically throws the NPE below, from inside the core Java library. Can anyone give a hint as to what might be causing it? Is it related to the parallel stream? Do I need some other mechanism for parallel streaming?
Exception
java.lang.NullPointerException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:598)
at java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:677)
at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:735)
at java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:714)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
In a different but similar scenario I also observed the following exception:
org.hibernate.AssertionFailure: bug adding collection twice
at org.hibernate.engine.internal.StatefulPersistenceContext.addCollection(StatefulPersistenceContext.java:851)
at org.hibernate.engine.internal.StatefulPersistenceContext.addInitializedCollection(StatefulPersistenceContext.java:890)
// -- Will add the remaining part of the stack trace if required.
Edit 2 (breaking news!) We found this same behaviour in two places. Both have one line in common: loading a relationship via lazy fetching. The entity config looks like this:
@ManyToMany(fetch = FetchType.LAZY)
@JoinTable(.. ... ...)
private Set<Classification> classifications;
Is there anything that goes wrong when a parallel stream is combined with lazy fetching?
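To make the suspicion concrete: the pipeline would only seem safe if every lazy association were fully initialized before the stream forks, so that worker threads never touch the shared Hibernate Session (which is not thread-safe). Below is a minimal, Hibernate-free sketch of that "materialize first, parallelize after" shape; PersonDto and the method names are made up for illustration. In real code, step 1 is where Hibernate.initialize(...) or a JOIN FETCH query would belong:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class MaterializeFirstDemo {

    // Hypothetical DTO holding only plain, fully-loaded values (detached
    // from any session-like resource).
    static final class PersonDto {
        final long id;
        final Set<String> classifications;
        PersonDto(long id, Set<String> classifications) {
            this.id = id;
            this.classifications = classifications;
        }
    }

    // Step 1 (single thread): copy everything out of the "entities" while
    // the session-like resource is still valid and owned by one thread.
    public static List<PersonDto> materialize(List<Long> ids) {
        return ids.stream()
                .map(id -> new PersonDto(id, Set.of("c" + id)))
                .collect(Collectors.toList());
    }

    // Step 2 (parallel): pure CPU work on detached DTOs; no shared
    // mutable session state is touched by the worker threads.
    public static List<String> export(List<PersonDto> dtos) {
        return dtos.parallelStream()
                .map(d -> d.id + ":" + d.classifications)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Long> ids = new ArrayList<>();
        for (long i = 0; i < 1000; i++) ids.add(i);
        List<String> rows = export(materialize(ids));
        System.out.println(rows.size()); // 1000
    }
}
```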
Edit 1
After some more investigation, looking at java.util.concurrent.ForkJoinTask line #291, the code looks like this:
try {
    completed = exec();
} catch (Throwable rex) {
    return setExceptionalCompletion(rex);
}
So it looks like one of the map blocks failed to execute. Therefore, to answer @Eugene's question in the comments about whether that is the only stack trace: no, there is a "Caused By" section. Here it is:
Caused by: java.lang.NullPointerException: null
at org.hibernate.engine.loading.internal.LoadContexts.cleanup(LoadContexts.java:81)
at org.hibernate.engine.loading.internal.CollectionLoadContext.endLoadingCollections(CollectionLoadContext.java:202)
at org.hibernate.loader.plan.exec.process.internal.CollectionReferenceInitializerImpl.endLoading(CollectionReferenceInitializerImpl.java:154)
at org.hibernate.loader.plan.exec.process.internal.AbstractRowReader.finishLoadingCollections(AbstractRowReader.java:249)
at org.hibernate.loader.plan.exec.process.internal.AbstractRowReader.finishUp(AbstractRowReader.java:212)
at org.hibernate.loader.plan.exec.process.internal.ResultSetProcessorImpl.extractResults(ResultSetProcessorImpl.java:123)
at org.hibernate.loader.plan.exec.process.internal.AbstractLoadPlanBasedLoader.executeLoad(AbstractLoadPlanBasedLoader.java:122)
at org.hibernate.loader.plan.exec.process.internal.AbstractLoadPlanBasedLoader.executeLoad(AbstractLoadPlanBasedLoader.java:86)
...
at org.hibernate.internal.SessionImpl.initializeCollection(SessionImpl.java:2004)
...
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
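This also explains the reflective frames in the outer trace above: when a task fails on a ForkJoin worker thread, ForkJoinTask rebuilds the exception reflectively on the calling thread (hence Constructor.newInstance in the stack), so the NPE that surfaces carries the reporting path and, when the failure happened on a worker, the original exception as its cause. A minimal, standalone demonstration of the wrapping behaviour:

```java
import java.util.Arrays;
import java.util.stream.Collectors;

public class ParallelExceptionDemo {
    public static void main(String[] args) {
        try {
            // Arrays.asList permits null elements, so the NPE is thrown
            // by String::toUpperCase inside the stream, possibly on a
            // ForkJoin worker thread.
            Arrays.asList("a", null, "c").parallelStream()
                    .map(String::toUpperCase)
                    .collect(Collectors.toList());
        } catch (NullPointerException e) {
            // The NPE surfaces on the calling thread. If it was thrown on
            // a worker, ForkJoinTask reconstructs it via reflection and
            // the worker-side original may appear as e.getCause().
            System.out.println("caught: " + e.getClass().getName());
        }
    }
}
```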