I converted most of the sub-projects of my application to the new Java module system that came with Java 9+.
Eventually, when I came to the one that uses Apache Spark, I fell into a trap. Spark modules seem to be available only under automatic names like "spark.core.2.11", which contain numeric segments and are refused by the compiler: each dot-separated segment of a module name must be a valid Java identifier, and therefore cannot start with a digit.
module fr.pays.spark {
    requires fr.pays.geographie;
    requires fr.pays.territoire;
    requires fr.pays.fondation.objetmetier;

    requires spring.beans;
    requires spring.boot;
    requires spring.web;

    requires spark.core.2.11; // rejected because of the numbers inside
}
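
For reference, this is how I checked what automatic module name the Spark jar gets (the jar comes from my local Maven cache, so adjust the path to your setup):

jar --describe-module --file spark-core_2.11-2.3.1.jar
# Fails with "Unable to derive module descriptor" (the same error as in
# the link below), because the derived name would be "spark.core.2.11"
# and the segments "2" and "11" are not valid Java identifiers.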
I've found this link in a response on Stack Overflow: Unable to derive module descriptor for auto generated module names in Java 9?. And I am thankful for it, because it may be a solution (one that I still have to understand, and that I haven't tried yet).
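If I understand that answer correctly, the workaround amounts to giving the jar a name from which a valid automatic module name can be derived, something like this (untested on my side, and the target file name is just an example):

# Copy the jar under a name whose derived automatic module name is
# valid: "spark-core.jar" derives the module name "spark.core".
cp spark-core_2.11-2.3.1.jar spark-core.jar
jar --describe-module --file spark-core.jar
# should now report something like: spark.core automatic

The declaration in my module-info.java would then become "requires spark.core;".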
However, this seems really clumsy to me. Am I misleading myself? A year has passed since the release of Java 9, and I figure that Spark must have changed to become fully compliant with Java 9+ by now.
What is the proper way to reference Spark modules today (I use version 2.3.1, the latest I've found)?
If there is no better way available than the one the link suggests, do you have any information about when Apache Spark plans to fully integrate with the Java 9+ module system?
Thanks a lot!