I am using Apache Spark in my Scala application. The problem is that the application constantly uses all CPU cores (around 95% on average). I even tried setting spark.cores.max to 1, but it did not seem to help. The application runs in local mode on a machine with 4 CPU cores. What might be the problem? Here are the versions and the code I use to create the Spark configuration.

  import org.apache.spark.SparkConf

  // Build the Spark configuration for local mode.
  val sparkConf = new SparkConf()
    .setAppName("MyApp")
    .setMaster("local[4]")        // local mode with 4 worker threads
    .set("spark.cores.max", "1")  // attempt to cap core usage at 1
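For comparison, here is a minimal sketch of what I would try to force a single worker thread, assuming that in local mode the parallelism is governed by the number in the master URL rather than by spark.cores.max (the name singleThreadConf is just illustrative):

  import org.apache.spark.SparkConf

  // Sketch under the assumption that the master URL controls
  // local-mode parallelism: "local[1]" should keep Spark on a
  // single worker thread instead of one per core.
  val singleThreadConf = new SparkConf()
    .setAppName("MyApp")
    .setMaster("local[1]")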

Scala version: 2.11.12
Spark version: 2.3.0
