14

I was trying to run a basic Spark Streaming example in Scala IDE, but I am getting the error below:

Error: Could not find or load main class org.test.spark.streamExample.

Could anyone help me sort this out, please?

dds
kumar
  • Possible duplicate of [Scala project won't compile in Eclipse; "Could not find the main class."](http://stackoverflow.com/questions/3953468/scala-project-wont-compile-in-eclipse-could-not-find-the-main-class) – Ani Menon Sep 08 '16 at 09:42

10 Answers

6

Right-click on your project and go to Properties, where you will find Scala Compiler. Change the target to jvm-1.7 (or whatever matches your installation), and also change the Scala Installation dropdown to the version you have installed.
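
If you manage the build with sbt rather than through the IDE dialog, the equivalent settings look roughly like this (a minimal sketch, assuming Scala 2.11, whose compiler accepts the -target:jvm-1.7 flag; other Scala versions spell this flag differently):

// build.sbt -- example settings only
scalaVersion := "2.11.12"
scalacOptions += "-target:jvm-1.7"  // emit bytecode for JVM 1.7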

sudhir
4

This error may occur for two reasons:

1. You did not write a main method in the Scala program:

object StreamExample {
  def main(args: Array[String]): Unit = {
    println("Test")
  }
}


2. You added an unsupported JAR to the build path of the Scala project.

Please check the above two scenarios.
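
For scenario 1, note that extending scala.App is an equivalent way to give an object a runnable entry point (a minimal sketch; the object name is illustrative):

object StreamExample extends App {
  // the object body becomes the program's main method
  println("Test")
}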

rajkumar chilukuri
2

If your Scala file was imported from an external source, check the very top of the code in the main file and confirm that the package declaration matches your project's package structure.
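
For example (a minimal sketch; the package and object names mirror the ones in the question and should be adjusted to your own), the package declaration must match the folder the file lives in, and the run configuration must use the fully qualified name:

// File: src/org/test/spark/streamExample.scala
// The run configuration must point at org.test.spark.streamExample
package org.test.spark

object streamExample {
  def main(args: Array[String]): Unit = {
    println("main class found and loaded")
  }
}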

Ted Corleone
1

The problem mostly occurs due to an incompatible version. Try to build Spark against the version of Scala you are using.

Another solution to this problem:

Right-click on the project => Properties => Scala Compiler => Scala Installation => from the dropdown, select the correct version of Scala (a lower version is preferred; if one is not installed, install a lower version of Scala and repeat the steps).
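
As a sketch of what version alignment looks like in an sbt build (the version numbers are only an example; pick the pair that matches your installation), the Scala binary suffix of the Spark artifacts must agree with scalaVersion, which the %% operator handles automatically:

// build.sbt -- example versions only
scalaVersion := "2.11.12"

// %% appends the Scala binary suffix (_2.11 here), keeping Spark and Scala in sync
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.4.0"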

Ananta Chandra Das
1

I faced this same "Error: Could not find or load main class" problem for more than a day. I reinstalled the IDE (IntelliJ) and switched from JDK 11 to JDK 8, but nothing worked. Finally, I resolved it by adding the two dependencies below.

Solution:

We have to add both the spark-core and spark-sql dependencies in build.sbt:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"

I copied both of these dependencies from these links (%% appends the Scala binary suffix, e.g. _2.12, automatically, which is why the artifact names above omit it):

https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.12/2.4.0

https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.12/2.4.0

jasraj
0

The main reason for such an error is that compilation errors may exist; it could be a build-path-related issue or a code issue. Check for issues in the Problems tab.

Other than this, check that the Scala library version in your POM matches the Scala installation version in the Scala compiler settings.

Azam Khan
0

If you are using the IntelliJ IDE, go to the dropdown at the top right of the IDE window and select "Edit Configurations".

On the left side of the pop-up window, select "Scala Console", and on the right side, select your module name under "Use classpath and SDK of module".


KayV
0

In my scenario:

Step 1: The application was built with sbt.

Step 2: I imported the application into Eclipse.

Step 3: When running the application, I got the "Could not find or load main class" error.

Solution:

The Problems tab showed a Scala version mismatch: sbt was using Scala 2.11, while Eclipse had Scala set to 2.12. Setting the Eclipse Scala compiler version to 2.11 made it work.

Please check this possibility.

0
  • Select the "Scala Library Container [x.x.x]"
  • Right-click and open Properties
  • Change it to a different (lower) version (I tried 2.11.11)
  • Click Apply and Close.
slfan
F Usman
0

If you are on Linux and using only a text editor and a terminal to run scalac and scala, make sure that you have set $SCALA_HOME and $PATH:

export SCALA_HOME=/usr/local/share/scala   # path to your Scala installation
export PATH=$PATH:$SCALA_HOME/bin          # puts scalac and scala on the command line

Naga Sandeep