I want to reuse existing C++ code in Apache Spark, which runs on the JVM. I used SWIG to generate the JNI interface, and it works like a charm as long as I call the functions from a simple class like this (MyJavaClass.java):
public class MyJavaClass {
    public static void main(String[] args) {
        System.loadLibrary("mycppcode");
        mycppcode.dostuff();
    }
}
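For reference, I run the simple version like this, with libmycppcode.so sitting in the current directory (hence the explicit library path):

java -Djava.library.path=. MyJavaClass

and it prints "Hello World!" as expected.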
Then I tried to integrate the class into my package hierarchy, which I need in order to call it from Spark (MyJavaClass.java):
package com.mycompany.mypackage;

public class MyJavaClass {
    public void MyJavaMethod() {
        System.loadLibrary("mycppcode");
        mycppcode.dostuff();
    }
}
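Eventually this needs to run inside a Spark job. Roughly, the call site I have in mind looks like the sketch below (SparkDriver and the toy RDD are placeholders, not my real job):

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

import com.mycompany.mypackage.MyJavaClass;

public class SparkDriver {
    public static void main(String[] args) {
        // local master only for this sketch; the real job goes through spark-submit
        SparkConf conf = new SparkConf().setAppName("mycppcode-test").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // invoke the native method on the executors
        sc.parallelize(Arrays.asList(1, 2, 3))
          .foreach(x -> new MyJavaClass().MyJavaMethod());

        sc.stop();
    }
}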
But as soon as I add the package line, I get the following error when running javac on the file:
MyJavaClass.java:6: error: cannot find symbol
        mycppcode.dostuff();
        ^
  symbol:   variable mycppcode
  location: class MyJavaClass
1 error
I used the following commands to generate the JNI wrapper, the libmycppcode.so library, and the Java classes:
swig -c++ -java -package com.mycompany.mypackage mycppcode.i
g++ -fpic -c mycppcode.cpp mycppcode_wrap.cxx -std=c++17 -I/usr/lib/jvm/java-8-openjdk-amd64/include -I/usr/lib/jvm/java-8-openjdk-amd64/include/linux
g++ -shared mycppcode.o mycppcode_wrap.o -o libmycppcode.so
javac MyJavaClass.java
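Everything is generated into and compiled from one flat directory; I have not set up a com/mycompany/mypackage/ directory tree. The directory contents look roughly like this:

MyJavaClass.java
mycppcode.i
mycppcode.cpp
mycppcode_wrap.cxx
mycppcode.o
mycppcode_wrap.o
libmycppcode.so
mycppcode.java
mycppcodeJNI.java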
This is the C++ file (mycppcode.cpp):
#include <iostream>
using namespace std;

int dostuff() {
    cout << "Hello World!" << endl;
    return 0;
}
This is the interface file (mycppcode.i):
%module mycppcode
%{
extern int dostuff();
%}
extern int dostuff();
This is the generated JNI file (mycppcodeJNI.java):
package com.mycompany.mypackage;

public class mycppcodeJNI {
    public final static native int dostuff();
}
This is the generated Java file (mycppcode.java):
package com.mycompany.mypackage;

public class mycppcode {
    public static int dostuff() {
        return mycppcodeJNI.dostuff();
    }
}

Why can javac no longer find mycppcode once the package declaration is added, and what do I have to change to fix this?