
There are some solutions in this related question:

Windows Spark Error java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils

The mentioned error probably corresponds to the following exception:

java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x12a94400) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module

Switching back to Java 11 or below is not a solution for me. How can this be solved with Java 17?

asked by Arnd Eden, edited by OneCricketeer
  • You do not need any level of reputation to answer. Spark officially only supports Java 11. – OneCricketeer May 13 '22 at 15:25
  • As of Spark 3.3.0, Java 17 is supported -- however, it still references `sun.nio.ch.DirectBuffer`, so the `--add-exports` mentioned in the answer below is still required. – Greg Kopff Jun 23 '22 at 05:12
  • Does this answer your question? [Spark 3.3.0 breaks on Java 17 with "cannot access class sun.nio.ch.DirectBuffer"](https://stackoverflow.com/questions/73465937/spark-3-3-0-breaks-on-java-17-with-cannot-access-class-sun-nio-ch-directbuffer) – werner Sep 02 '22 at 20:29

1 Answer


You can solve this problem (and many similar Java 17 module-access problems) by adding an `--add-exports` option to the corresponding Java call, in this case `--add-exports java.base/sun.nio.ch=ALL-UNNAMED`.
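For example, a minimal sketch of passing the flag to a Spark job, assuming you launch via `spark-submit` (the application name `my_app.py` is a placeholder; adapt to however your JVM is actually started):

```shell
# Sketch: pass the module export to both the driver and executor JVMs.
# spark.driver.extraJavaOptions / spark.executor.extraJavaOptions are
# standard Spark configuration keys for extra JVM arguments.
spark-submit \
  --conf "spark.driver.extraJavaOptions=--add-exports=java.base/sun.nio.ch=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-exports=java.base/sun.nio.ch=ALL-UNNAMED" \
  my_app.py
```

For a plain `java` invocation, the same option goes directly on the command line: `java --add-exports java.base/sun.nio.ch=ALL-UNNAMED -cp ... Main`.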

answered by OneCricketeer, edited by Stephen C
  • I tried adding the export statement to maven-surefire-plugin in pom.xml but the same error message persists, do you know why? @OneCricketeer – Quan Bui Sep 02 '22 at 09:42
  • This text was written by @ArndEden in the original post and moved here as an answer. – OneCricketeer Sep 02 '22 at 14:32
  • Solved this by adding the above `--add-exports` statement to `vmArgs` in launch.json on VSCode. Adding it to maven-surefire-plugin in pom.xml did nothing for me. – Quan Bui Sep 05 '22 at 07:49
  • Also consider setting `-Dio.netty.tryReflectionSetAccessible=true` when using the Apache Arrow library. Per https://spark.apache.org/docs/latest/index.html: for Java 11, `-Dio.netty.tryReflectionSetAccessible=true` is additionally required for the Apache Arrow library. This prevents `java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available` when Apache Arrow uses Netty internally. – Quan Bui Sep 05 '22 at 07:54