I'm using Spark with Java. Whenever I try to run my code, an IOException is thrown at these lines:
SparkConf conf = new SparkConf().setAppName("myapp").setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
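For completeness, here is a minimal self-contained version of what I'm running (the class name and the sc.close() at the end are just part of my stripped-down reproduction, not the real application). The exception appears as soon as the context is created:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyApp {
    public static void main(String[] args) {
        // Local-mode Spark context; this constructor call is where the IOException is thrown
        SparkConf conf = new SparkConf().setAppName("myapp").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.close();
    }
}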
The exception message is:
Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
When I tried to download winutils.exe and install it, I got this error instead:
The program can't start because MSVCR100.dll is missing.
How can I solve this?