
This seems a bit silly, but I can't figure out what is happening in my IntelliJ project. The Spark dependencies don't seem to be set up correctly. I searched a lot on here without finding the right answer.

In build.sbt:

libraryDependencies += "junit" % "junit" % "4.10" % Test
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0"
)

When hovering, I get the message "Unknown artifact. Not resolved or indexed". I found a post on here (""Unknown artifact. Not resolved or indexed" error for scalatest") about updating the resolvers, but I couldn't find the table shown in that link. I tried refreshing the project without any improvement. I also tried to add the dependency automatically via a right-click on import org.apache.spark.SparkConf, but the response is that it can't find the library. However, I can see the library in the project structure (root build).
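For reference, a minimal build.sbt that should resolve these dependencies might look like the sketch below. The name and scalaVersion are assumptions (not taken from the question); the Scala version matters because Spark 2.1.0 artifacts were only published for Scala 2.10 and 2.11, and the %% operator appends the Scala binary version to the artifact name:

```scala
// build.sbt -- minimal sketch; name and scalaVersion are assumptions
name := "spark-example"
scalaVersion := "2.11.8" // Spark 2.1.0 was built for Scala 2.10/2.11 only

libraryDependencies += "junit" % "junit" % "4.10" % Test
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0", // resolves to spark-core_2.11
  "org.apache.spark" %% "spark-sql" % "2.1.0"
) // the Seq(...) must be closed
```

If the Scala version declared here doesn't match one Spark was published for, sbt will fail to resolve the artifacts no matter how often the project is refreshed.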

In the .scala class:

import org.apache.spark.SparkConf // SparkConf shows in red
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD

Any advice? It is blocking me. I appreciate any help... :)

Oliver

1 Answer


The "Unknown artifact. Not resolved or indexed" inspection is not an error in itself. If you refresh the project and the refresh finishes correctly, the message is safe to ignore or disable; it is only meant to be informative.
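If refreshing alone doesn't make the artifacts resolve, it can help to check the resolvers in build.sbt. A hedged sketch follows; note that sbt already includes Maven Central by default, so declaring it explicitly is usually redundant and only useful to rule out a misconfigured repository list:

```scala
// build.sbt -- sketch of explicitly declaring resolvers (often unnecessary,
// since sbt ships with Maven Central preconfigured)
resolvers ++= Seq(
  "Maven Central" at "https://repo1.maven.org/maven2/",
  Resolver.mavenLocal // also search the local ~/.m2 repository
)
```

After changing resolvers, re-import the sbt project in IntelliJ so the IDE picks up the new repository list.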

Justin Kaeser
  • I updated the resolvers from the module settings and then re-imported the project (even File -> Invalidate Caches/Restart didn't work). Now the Spark libraries are linked. Thank you for the reply. – Oliver Aug 14 '18 at 09:31