
I converted most of the sub-projects of my application to the new Java module system that came with Java 9+.

Eventually, when I came to the one that uses Apache Spark, I fell into a trap. Spark modules seem to be available only under names like "spark.core.2.11", which have numbers inside and are refused by the compiler.

module fr.pays.spark {
   requires fr.pays.geographie;
   requires fr.pays.territoire;
   requires fr.pays.fondation.objetmetier;

   requires spring.beans;
   requires spring.boot;
   requires spring.web;

   requires spark.core.2.11; // rejected because of the numbers inside
}
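
As far as I can tell, the name is rejected because the JDK derives it automatically from the JAR file name (presumably spark-core_2.11-2.3.1.jar), and the resulting segments "2" and "11" start with digits, which are not legal in a Java module name. The jar tool shows the problem directly (a sketch, assuming the standard Maven artifact file name):

    # Ask the JDK which module name it derives from the JAR.
    # For spark-core_2.11-2.3.1.jar this fails: the derived name
    # "spark.core.2.11" contains segments starting with a digit,
    # which are not valid in a Java module name.
    jar --describe-module --file=spark-core_2.11-2.3.1.jar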

I've found this link in a response on Stack Overflow: Unable to derive module descriptor for auto generated module names in Java 9?. I'm thankful, because it may be a solution (one that I still have to understand, and that I haven't tried yet).
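
If I understand that answer correctly, the workaround boils down to patching the JAR with an explicit Automatic-Module-Name manifest entry before putting it on the module path. A sketch of what I believe is involved, untested; the name spark.core and the file manifest-additions.txt are my own choices, nothing official:

    # Write a manifest fragment declaring a valid module name of our choosing.
    echo "Automatic-Module-Name: spark.core" > manifest-additions.txt

    # Merge the entry into the existing manifest of the Spark JAR (JDK 9+ jar tool).
    jar --update --file=spark-core_2.11-2.3.1.jar --manifest=manifest-additions.txt

The module declaration above could then say requires spark.core; instead of the rejected requires spark.core.2.11;.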

However, it seems really clumsy to me. Am I misleading myself? A year has passed since the release of Java 9, and I figure that Spark must have changed to become fully compliant with Java 9+ by now.

What is the proper way to reference Spark modules today (I use version 2.3.1, the latest I've found)?

If there is no better way than the one the link suggests, do you have any information about when Apache Spark plans to fully integrate with the Java 9+ module system?

Thanks a lot!

  • They still don't have anything inlined, as it [seems](https://stackoverflow.com/questions/59844195/how-to-add-spark-dependencies-in-spring-boot-multi-module-java-11-project). – Naman Jan 21 '20 at 16:30
  • This question was asked two years ago. Is there a solution? – Jonathan Locke Aug 08 '20 at 03:05
  • I think not, and I believe that if any solution comes, it will be with version 3.0 of Spark. But it has only just been released, and I haven't tried it yet. – Marc Le Bihan Aug 08 '20 at 21:01

0 Answers