I'm implementing a simple program in Java that uses Spark SQL to read a Parquet file and build an ArrayList of FieldSchema objects (from the Hive metastore API), where each object represents a column with its name and data type. However, Spark SQL seems unable to coexist with the dependency that provides the FieldSchema class.

For example, take the following minimal program:

import org.apache.spark.sql.SparkSession;

public class main {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("Application Name").config("spark.master", "local").getOrCreate();
    }
}
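For context, the eventual goal would look roughly like the sketch below. The Parquet path "data.parquet" is a placeholder, and mapping each column via DataType.catalogString() with an empty comment string is one reasonable choice, not necessarily the only one:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.hive.metastore.api.FieldSchema;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.StructField;

public class Main {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("Application Name")
                .config("spark.master", "local")
                .getOrCreate();

        // "data.parquet" is a placeholder path
        Dataset<Row> df = spark.read().parquet("data.parquet");

        // Map each Spark column to a Hive metastore FieldSchema(name, type, comment)
        List<FieldSchema> columns = new ArrayList<>();
        for (StructField field : df.schema().fields()) {
            columns.add(new FieldSchema(field.name(), field.dataType().catalogString(), ""));
        }

        spark.stop();
    }
}
```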

This configuration of dependencies in build.gradle (IntelliJ)

dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'

    implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
    implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}

makes the program run successfully.

On the other hand, this configuration of dependencies (with hive-service added so that org.apache.hadoop.hive.metastore.api.FieldSchema can be imported later)

dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'

    implementation "org.apache.hive:hive-service:+"
    implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
    implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}

fails with the following output:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/davidtran/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-unsafe_2.12/3.1.1/1c3b07cb82e71d0519e5222a5ff38758ab499034/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" java.lang.NoSuchFieldError: JAVA_9
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:109)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:371)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2678)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:942)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:936)
    at main.main(main.java:6)

1 Answer

I was able to fix the error below by excluding org.apache.commons:commons-lang3 from the dependency list of org.apache.hadoop:hadoop-common (which hive pulls in transitively).

java.lang.NoSuchFieldError: JAVA_9
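In Gradle terms (the question uses build.gradle rather than a pom.xml), the exclusion would look something like the sketch below. Pinning a concrete hive-service version instead of `+` is an assumption here, not something the answer specifies:

```groovy
dependencies {
    implementation('org.apache.hive:hive-service:3.1.2') {
        // hive drags in an old commons-lang3 via hadoop-common that lacks
        // the JAVA_9 field Spark needs; exclude it so Spark's newer version wins
        exclude group: 'org.apache.commons', module: 'commons-lang3'
    }
    implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
    implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}
```

Which commons-lang3 version each dependency contributes can be confirmed with Gradle's dependency report, e.g. `./gradlew dependencies --configuration runtimeClasspath`.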
