
Whenever I add a module-info.java to my multi-module project, I cannot import my Spark dependencies; everything else seems to be working:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>

[screenshot: IntelliJ's import package suggestions]

IntelliJ tries to re-add the Maven dependency, without any result.

My module-info.java looks like:

module common {
    exports [...] 
    requires lombok;
    requires spring.data.jpa;
    requires spring.data.commons;
    requires org.apache.commons.lang3;
    requires spring.context;
    requires spring.web;
    requires spring.security.core;
    requires com.google.common;
    requires org.json;
    requires spring.core;
    requires spring.beans;
    requires com.fasterxml.jackson.core;
    requires com.fasterxml.jackson.databind;
    requires spring.jcl;
    requires spring.webmvc;
    requires mongo.java.driver;
    requires org.hibernate.orm.core;
    requires com.fasterxml.jackson.dataformat.csv;
    requires java.sql;
}

It is not possible to add a requires directive for any org.apache.spark module in my module-info.java either.
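
For illustration, a directive along these lines is what fails to resolve (spark.core is only a guessed name; the Spark jars declare none):

module common {
    // Hypothetical: "spark.core" is a guessed module name; no module
    // name for the Spark artifacts actually resolves.
    requires spark.core;
}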

Is it possible that Spark is not ready for Jigsaw modules and Java 9+?


1 Answer


Is it possible that Spark is not ready for Jigsaw modules and Java 9+?

It does hold true for Spark. Two immediate reasons that I can vouch for are:

  1. They do not have an entry for

    Automatic-Module-Name: <module-name> 
    

    in the artifact's MANIFEST.MF file.

  2. If you try describing their artifacts using the jar tool

    jar --describe-module --file=<complete-path>/spark-core_2.12-3.0.0-preview2.jar
    

    This would fail to derive the module descriptor, for a reason similar to the one mentioned in this answer (illustrated right after this list).
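
Concretely, the failure looks roughly like this (output reproduced from memory, so the exact wording may vary across JDK versions): the tool strips the version from the file name, derives the candidate name spark.core.2.12 from spark-core_2.12, and rejects it because the segment 2 is not a valid Java identifier.

jar --describe-module --file=<complete-path>/spark-core_2.12-3.0.0-preview2.jar
# Unable to derive module descriptor for: spark-core_2.12-3.0.0-preview2.jar
# spark.core.2.12: Invalid module name: '2' is not a Java identifier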



  • Side note: Even when Spark might be ready to be [built and run with JDK-11](https://issues.apache.org/jira/browse/SPARK-24417), they do not expose artifacts that can be used on the module path by others. – Naman Jan 21 '20 at 16:05
  • So it is not possible to use a module with Java 8 and have the rest of the project on, let's say, 11? In other words, anything that depends on Spark needs to be downgraded to Java 8? Or could I extract it as a jar and use it on the classpath somehow? I thought Jigsaw modules were optional with Java >= 9, but it seems they are a requirement :( – Michail Michailidis Jan 21 '20 at 16:06
  • 1
    @MichailMichailidis There are multiple considerations to it. First, if you are planning to create your application to be modular i.e. include a `module-info.java` in itself with JDK-11, then the only way would be updating the spark jar to manually add the entry in `MANIFEST.MF`. Second, if you plan to run on JDK-11, but using classpath, then you don't need a `module-info.java` and things should work fine without it as with Java-8. Then, if creating a modular application was the *only purpose* for you to migrate, you can very well remain with Java-8. You can choose accordingly for your app. – Naman Jan 21 '20 at 16:15