I am new to Apache Spark and I am testing my first program.
It is a two- or three-line program, just for testing purposes.
I am using Eclipse and I build the project into a jar with Maven (the build step I use is shown below).
When I try to run spark-submit I get the error below.
I do not think it comes from the file name or the path.
Could it be caused by something else?
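For reference, I build the jar with a plain Maven package from the project folder, something like this (nothing custom in the build):

cd C:\Users\...\SparkTest
mvn clean package

That produces target\SparkTest-0.0.1-SNAPSHOT.jar, which is the file I pass to spark-submit.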
...spark-2.1.0-bin-hadoop2.7\bin>spark-submit --class "Main" --master local[4] "C:\Users\...\target\SparkTest-0.0.1-SNAPSHOT.jar"
The filename, directory name, or volume label syntax is incorrect.
This is the main class:
import java.util.Arrays;

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.PairFunction;

public class SparkMain {

    public static void main(String[] args) {
        // create a local Spark context
        SparkConf conf = new SparkConf().setMaster("local").setAppName("My App");
        JavaSparkContext sc = new JavaSparkContext(conf);

        System.out.println("HELLO");

        // read the bundled README and print its line count
        JavaRDD<String> lines = sc.textFile("C:/spark/spark-2.1.0-bin-hadoop2.7/README.md");
        System.out.println(lines.count());
    }
}