
I am trying to develop an application with Spark and I am getting the following errors when executing it in MyEclipse.

I have added all the dependency JARs to the application, but I am still getting these errors.

ERROR Shell: Failed to locate the winutils binary in the Hadoop binary path java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/network/client/StreamCallback Caused by: java.lang.ClassNotFoundException: org.apache.spark.network.client.StreamCallback
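The `NoClassDefFoundError` suggests that `spark-network-common` (the module that contains `StreamCallback`) is not on the runtime classpath; adding JARs by hand often misses transitive dependencies like this one. A sketch of declaring Spark through Maven instead, which pulls in `spark-network-common` transitively — the version and Scala suffix are assumptions (Spark 1.6 on Scala 2.10, plausible for mid-2016) and must match the Spark build you actually run against:

```xml
<!-- Hypothetical coordinates: adjust version and _2.xx suffix to your Spark build -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.1</version>
</dependency>
```

Letting Maven resolve the dependency tree avoids having to track down each Spark module JAR individually.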

Please help me with this and suggest how I can overcome these issues.

    There are multiple answers on SO regarding your winutils.exe problem, for example: http://stackoverflow.com/questions/34697744/spark-1-6-failed-to-locate-the-winutils-binary-in-the-hadoop-binary-path – Binary Nerd Jun 16 '16 at 07:22
  • @BinaryNerd, I have already configured winutils.exe on my system and it works fine for Scala and basic Java programs. The problem occurs when I try to integrate Spark with a Java EE application. – Eswar Kumar Jun 16 '16 at 10:04
  • Neither Java nor Scala needs winutils specifically; Spark/Hadoop need it. The `null` in the path tells me you're not setting `hadoop.home.dir`. – Binary Nerd Jun 16 '16 at 16:20
  • Thanks @BinaryNerd for your help; it is working now after I tried with winutils. – Eswar Kumar Jun 17 '16 at 09:42
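As the comments above note, the `null` in `null\bin\winutils.exe` means `hadoop.home.dir` is unset. A minimal sketch of setting it programmatically before creating the Spark context — `C:\hadoop` is a hypothetical path whose `bin\` subfolder must contain winutils.exe:

```java
public class WinutilsSetup {
    public static void main(String[] args) {
        // Hypothetical path: its bin\ subdirectory must contain winutils.exe
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        // Create the SparkConf/JavaSparkContext only AFTER the property is set,
        // e.g. new JavaSparkContext(new SparkConf().setAppName("app").setMaster("local[*]"));
        System.out.println(System.getProperty("hadoop.home.dir")); // prints "C:\hadoop"
    }
}
```

Alternatively, set the `HADOOP_HOME` environment variable to the same folder before launching the application.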

0 Answers