Currently, I have built a simple Spark application with the code below:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

SparkConf conf = new SparkConf().setAppName("Test").setMaster("local[2]");
JavaSparkContext sc = new JavaSparkContext(conf);
String path = "file:///C:/Users/Me/Desktop/demo/1530842877616/";

// Load every file under the path as a (file URI, file content) pair
JavaPairRDD<String, String> rdd = sc.wholeTextFiles(path);

// Transform each file's content, keeping the file name as the key
JavaPairRDD<String, String> transformed = rdd.mapToPair(tuple2 -> {
    String fname = tuple2._1();
    String content = tuple2._2();
    content = YUVSimpleTrans.transform(content);
    return new Tuple2<>(fname, content);
});

System.out.println("saveAsTextFile....");
transformed.saveAsTextFile("file:///C:/Users/Me/Desktop/demo/1530842877616/out");

It's very simple: it loads a JavaPairRDD from a path, and there are many ".yuv" files in that path.
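
For reference, each pair maps a file's full URI to that file's whole content. A minimal sketch I can use to check which files were picked up (split_0.yuv is one of the files in the directory, as seen in the exception below; the exact URI format is an assumption):

// Print the key of every loaded file; I expect entries roughly like
// file:/C:/Users/Me/Desktop/demo/1530842877616/split_0.yuv
rdd.keys().collect().forEach(System.out::println);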

I don't use "spark-submit"; I run it in a Spring Boot environment instead. Everything goes well, but the following exception is thrown when the "saveAsTextFile" job is triggered.

java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: (null) entry in command string: null ls -F C:\Users\Me\Desktop\demo\1530842877616\split_0.yuv
     at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:762)
     at org.apache.hadoop.util.Shell.execCommand(Shell.java:859)
     at org.apache.hadoop.util.Shell.execCommand(Shell.java:842)

I'm really confused about why it throws this exception. I passed a correct path starting with "file:///", which it resolves as a Windows path. So why does it run the Linux command "ls -F" at this point?
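
From what I can tell, Hadoop's local filesystem shells out through org.apache.hadoop.util.Shell to read file permissions, and on Windows that command is built around winutils.exe; when neither the hadoop.home.dir system property nor the HADOOP_HOME environment variable points at it, the executable resolves to null, which seems to match the "(null) entry in command string: null ls -F" message. A minimal sketch of the workaround I've seen suggested, assuming winutils.exe is placed under the hypothetical location C:/hadoop/bin:

// Point Hadoop at a local winutils installation before creating the context.
// "C:/hadoop" is an assumed location; winutils.exe must exist at C:/hadoop/bin/winutils.exe.
System.setProperty("hadoop.home.dir", "C:/hadoop");

SparkConf conf = new SparkConf().setAppName("Test").setMaster("local[2]");
JavaSparkContext sc = new JavaSparkContext(conf);

I'm not certain this is the cause in my case, though.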

Any suggestions? Or did I miss any important information?
