My situation is as follows: I would like to submit a 32-bit driver program to Spark. The driver program is in C# (a .NET Framework console app), and I use Mobius (https://github.com/Microsoft/Mobius/releases) to enable Spark for C#.
The app is rather simple and contains only a very small piece of test code:
using System.Collections.Generic;
using Microsoft.Spark.CSharp.Core;

class Program
{
    public static void Main(string[] args)
    {
        var conf = new SparkConf();
        var sparkContext = new SparkContext(conf);
        var rdd = sparkContext.Parallelize(new List<string>() { "a", "b", "c", "d" });
        var response = rdd.Map(s => s).Collect();
    }
}
After I submit this app to sparkclr (that is, to Mobius, which then passes it on to Spark itself), I get the following exception:
System.Exception: JVM method execution failed: Static method collectAndServe failed for class org.apache.spark.api.python.PythonRDD when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=11], )
at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters)
at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallStaticJavaMethod(String className, String methodName, Object[] parameters)
at Microsoft.Spark.CSharp.Proxy.Ipc.RDDIpcProxy.CollectAndServe()
at Microsoft.Spark.CSharp.Core.RDD`1.Collect()
The exception goes away when I build the app as x64. The exception also does not occur when the app is built as x86 but does nothing with Spark, i.e. when it omits these lines:

var rdd = sparkContext.Parallelize(new List<string>() { "a", "b", "c", "d" });
var response = rdd.Map(s => s).Collect();
Is there any workaround to submit a 32-bit app to Spark?
Could this be because I have the 64-bit version of Java? (What I downloaded and installed was jdk-8u161-windows-x64.exe.)
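For what it's worth, here is the small diagnostic I used to confirm which bitness the driver process actually runs under (these are standard .NET APIs, nothing Mobius-specific; an x86 build should report a 32-bit process even on a 64-bit OS):

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process.
        Console.WriteLine("IntPtr.Size: " + IntPtr.Size);
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS: " + Environment.Is64BitOperatingSystem);
    }
}
```

When built as x86 this prints `IntPtr.Size: 4` and `64-bit process: False`, regardless of the OS bitness, which is how I verified the build configuration was actually taking effect.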