
My situation is as follows: I would like to submit a 32-bit driver program to Spark. The driver program is a C# (.NET Framework) console app, and I use Mobius (https://github.com/Microsoft/Mobius/releases) to enable Spark for C#.

The app is rather simple and contains only the following test code:

using System.Collections.Generic;
using Microsoft.Spark.CSharp.Core;   // Mobius types: SparkConf, SparkContext, RDD

public static void Main(string[] args)
{
    var conf = new SparkConf();
    var sparkContext = new SparkContext(conf);

    // Distribute a small in-memory collection, apply an identity Map and collect it back.
    var rdd = sparkContext.Parallelize(new List<string>() { "a", "b", "c", "d" });

    var response = rdd.Map(s => s).Collect();
}

After I submit this app to SparkCLR (that is, to Mobius, which then passes it on to Spark itself), I get the following exception:

System.Exception: JVM method execution failed: Static method collectAndServe failed for class org.apache.spark.api.python.PythonRDD when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=11], )
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters)
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallStaticJavaMethod(String className, String methodName, Object[] parameters)
   at Microsoft.Spark.CSharp.Proxy.Ipc.RDDIpcProxy.CollectAndServe()
   at Microsoft.Spark.CSharp.Core.RDD`1.Collect()

The exception goes away when I build the app for x64. It also does not occur when the app is built for x86 but does nothing, i.e. when the lines

var rdd = sparkContext.Parallelize(new List<string>() { "a", "b", "c", "d" });

var response = rdd.Map(s => s).Collect();

are removed.

Is there any workaround that allows submitting a 32-bit app to Spark?

Could this be because I have the 64-bit version of Java? (What I downloaded and installed was jdk-8u161-windows-x64.exe.)
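
A quick sanity check I can add to the driver (a minimal sketch, assuming the java on PATH is the JVM that Spark actually launches) to confirm the bitness on both sides:

Console.WriteLine("Driver is a 64-bit process: " + Environment.Is64BitProcess);
Console.WriteLine("OS is 64-bit: " + Environment.Is64BitOperatingSystem);
// The JVM side can be checked from a shell with `java -version`;
// a 64-bit JDK reports "64-Bit Server VM" in its output.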

C_M
  • Almost certainly. *Why* don't you want to compile for x64? – Panagiotis Kanavos Feb 20 '18 at 13:30
  • @PanagiotisKanavos In reality I'd like to submit a 32-bit application to Spark because it needs to use a 32-bit DLL. – C_M Feb 22 '18 at 09:28
  • My apologies for the late non-reply. I am posting this here to help others find a supported way to use .NET with Apache Spark: Microsoft has just released dataframe-based .NET support for Apache Spark via the .NET Foundation OSS. See http://dot.net/spark for more details. – Michael Rys Jun 01 '19 at 01:08
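
For reference, a minimal driver against the dataframe-based API mentioned in the comment above could look roughly like the sketch below (the app name and the Range example are illustrative placeholders, not taken from the original question):

using Microsoft.Spark.Sql;

class Program
{
    static void Main(string[] args)
    {
        // Build (or reuse) a SparkSession for the .NET for Apache Spark bindings.
        SparkSession spark = SparkSession
            .Builder()
            .AppName("dotnet-spark-sketch")
            .GetOrCreate();

        // Create a tiny DataFrame of the numbers 0..3 and print it.
        DataFrame df = spark.Range(0, 4);
        df.Show();

        spark.Stop();
    }
}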

0 Answers