
I use Spark with Java. Whenever I try to run my code, an IOException is thrown at these lines:

     SparkConf conf = new SparkConf().setAppName("myapp").setMaster("local[*]");
     JavaSparkContext sc = new JavaSparkContext(conf);

The detail of the exception is:

Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

When I tried to download winutils.exe and install it, I got this message:

The program can't start because MSVCR100.dll is missing

How can I solve this?

– hammadspark
  • Related questions: http://stackoverflow.com/questions/34697744/spark-1-6-failed-to-locate-the-winutils-binary-in-the-hadoop-binary-path and http://stackoverflow.com/questions/14557245/wamp-shows-error-msvcr100-dll-is-missing-when-install – Binary Nerd Jun 08 '16 at 10:56

2 Answers


You need to set the HADOOP_HOME environment variable to a valid path and place winutils.exe inside %HADOOP_HOME%\bin.
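
If setting the environment variable is awkward in your setup, the same thing can be done from code by setting the hadoop.home.dir system property before the context is created. A minimal sketch, assuming winutils.exe has been placed under C:\hadoop\bin (that path is only an example, adjust it to your layout):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class WinutilsCheck {
        public static void main(String[] args) {
            // Example path only: winutils.exe is assumed to sit in C:\hadoop\bin.
            // hadoop.home.dir plays the same role as the HADOOP_HOME variable.
            System.setProperty("hadoop.home.dir", "C:\\hadoop");

            SparkConf conf = new SparkConf().setAppName("myapp").setMaster("local[*]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // ... run your job ...

            sc.stop();
        }
    }

Setting HADOOP_HOME as a regular environment variable does the same job and also covers programs launched outside your IDE, for example via spark-submit.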

For MSVCR100.dll, download and install it (it ships with the Microsoft Visual C++ 2010 Redistributable).

Make sure you download the 32-bit or 64-bit winutils.exe and DLL to match your machine's architecture.

Here is a link for setting up Spark on Windows: How to run Apache Spark on Windows7 in standalone mode

Hope it helps.

– Nishu Tayal
  • I set the variable and installed the .dll file, but honestly I don't know how to do this correctly. Would you give me a link? I followed what I found on YouTube, but it didn't work. – hammadspark Jun 09 '16 at 14:54

I finally fixed this issue. I just followed the steps explained in this link; it is easy, clear, and effective: http://teknosrc.com/spark-error-java-io-ioexception-could-not-locate-executable-null-bin-winutils-exe-hadoop-binaries/ Thank you #Nishu Tayal

– hammadspark