
I'm trying to implement org.apache.spark.sql.Row. The interface has default implementations for several methods, and IntelliJ doesn't complain about my not overriding them. However, when building with Maven, I get:

FunctionalRow is not abstract and does not override abstract method mkString(java.lang.String,java.lang.String,java.lang.String) in org.apache.spark.sql.Row

Below is the class implementation:

import java.util.List;
import java.util.function.Supplier;

import org.apache.spark.sql.Row;

import scala.collection.JavaConverters;
import scala.collection.Seq;

public class FunctionalRow implements Row {
    protected List<Supplier<Object>> suppliers;

    public FunctionalRow(List<Supplier<Object>> suppliers) {
        this.suppliers = suppliers;
    }

    @Override
    public int length() {
        return suppliers.size();
    }

    @Override
    public Object get(int i) {
        return suppliers.get(i).get();
    }

    @Override
    public Row copy() {
        return this;
    }

    @Override
    public Seq<Object> toSeq() {
        return JavaConverters
                .asScalaIteratorConverter(suppliers.stream().map(Supplier::get).iterator())
                .asScala()
                .toSeq();
    }
}

maven-compiler-plugin settings:

        <plugin>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.8.0</version>
          <executions>
            <execution>
              <id>default-compile</id>
              <phase>compile</phase>
              <goals>
                <goal>compile</goal>
              </goals>
              <configuration>
                <source>1.8</source>
                <target>1.8</target>
                <encoding>UTF-8</encoding>
              </configuration>
            </execution>
            <execution>
              <id>default-testCompile</id>
              <phase>test-compile</phase>
              <goals>
                <goal>testCompile</goal>
              </goals>
              <configuration>
                <source>1.8</source>
                <target>1.8</target>
                <encoding>UTF-8</encoding>
              </configuration>
            </execution>
          </executions>
          <configuration>
            <source>1.8</source>
            <target>1.8</target>
            <encoding>UTF-8</encoding>
          </configuration>
        </plugin>

Any help would be appreciated!

Lior Chaga
    Perhaps it can't be done? https://stackoverflow.com/a/7637888/2204206 – Lior Chaga Oct 03 '19 at 07:32
    Was the Spark version you're using compiled with Scala 2.12 or later? With Scala 2.12, traits compile straight to Java interfaces. This is because Java 8 introduced default implementations in interfaces, so Scala 2.12 (which requires Java 8+) can translate to interfaces without the complex mechanics cited in the answer you linked to. I'm no Java-interop expert, but Java should see the default implementation of `mkString` if it's indeed there. – francoisr Oct 03 '19 at 08:27
    Well, unfortunately I'm using Spark 2.2, so Scala 2.12 is not an option... Upgrading Spark would be totally out of scope for my task, so I'll have to live with this limitation for now. But good to know, thanks! – Lior Chaga Oct 03 '19 at 09:15

1 Answer


I'm making my comment into an answer so you can mark this question as answered.

TL;DR: You cannot use the default implementations of Scala trait methods from Java if the trait was compiled by a Scala version earlier than 2.12. This is the case for the Spark version used here, so all hope is lost.

The reason has to do with how the Scala compiler encodes traits to be compatible with the JVM (and therefore with Java).

Before Java 8, interfaces were not allowed to provide default implementations. Scala traits, which are basically stackable interfaces with default implementations, had to be encoded as both an interface with no implementation and an abstract class that provided the implementations. For details about this, see this answer.
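
To give an intuition for that encoding (a simplified sketch, not the exact bytecode; the Greeter name is made up for illustration), a trait such as trait Greeter { def greet(n: String) = "Hi " + n } compiles, pre-2.12, to something along these lines in Java terms:

// A plain Java interface that only declares the signature:
interface Greeter {
    String greet(String name);
}

// A synthetic helper class (the compiler names it Greeter$class) that holds
// the method body as a static method taking the instance explicitly:
abstract class Greeter$class {
    public static String greet(Greeter self, String name) {
        return "Hi " + name;
    }
}

Scala classes that mix in the trait get a generated forwarder to the static method, but javac only ever sees the abstract interface method, which is why FunctionalRow is forced to provide every method body itself.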

Since Java 8, interfaces are allowed to provide default implementations, so Scala can encode traits with default implementations straight to Java interfaces and be directly compatible with Java code. Starting with Scala 2.12, Scala requires a Java 8+ compatible JVM and therefore gets this trait<->interface compatibility. If Scala code has been compiled by a Scala 2.12+ compiler, you can use its traits as normal Java interfaces.
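
As a practical takeaway for the question above, on a pre-2.12 Spark build the missing methods have to be written out by hand in Java. Below is a minimal, unverified sketch of what that could look like inside FunctionalRow for the mkString overloads, delegating to toSeq() the same way the Scala defaults do; any other methods the compiler flags would need the same treatment:

    // Hand-written equivalents of the trait defaults, since javac cannot see
    // them in a pre-2.12 build (sketch only, not verified against Spark 2.2).
    @Override
    public String mkString(String start, String sep, String end) {
        return toSeq().mkString(start, sep, end);
    }

    @Override
    public String mkString(String sep) {
        return toSeq().mkString(sep);
    }

    @Override
    public String mkString() {
        return toSeq().mkString();
    }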

francoisr