
I have Spark 1.6.0 installed and would like to import it in Scala 2.11. I can use spark-shell, which has org.apache on its classpath. How do I put it on the classpath of my system installation of Scala?

Daniel Gibson
  • Please don't use [tag:apache] for questions not related to the Apache HTTP Server :) It could also be useful if you explained what you mean by _installed_. By default, Spark binaries for versions < 2.0 are built with Scala 2.10. – zero323 Mar 27 '16 at 19:06
  • Sorry about that. 'Installed' is a good question. It means there is a precompiled binary in a folder. – Daniel Gibson Mar 27 '16 at 20:45
  • Sorry, what do you mean by keeping org.apache in Scala? Do you want to execute spark-shell, or do you want to import a Spark package in spark-shell? – Srini Mar 27 '16 at 22:05
  • I want to import Spark libraries in the regular Scala shell. For example: import org.apache.spark – Daniel Gibson Mar 27 '16 at 22:10
  • This may be a duplicate of http://stackoverflow.com/questions/18812399/how-to-use-third-party-libraries-with-scala-repl. You would need to get the desired libraries from Maven (http://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11) and then follow those instructions, using the -cp option (see the sketch after these comments). – charles gomes Mar 28 '16 at 04:16
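
A minimal sketch of what the comments describe, assuming a hypothetical prebuilt Spark 1.6.0 under /opt/spark-1.6.0 (adjust the jar path to your own install). Note zero323's caveat above: the prebuilt 1.6.0 assembly targets Scala 2.10, so the REPL you launch must be the matching Scala version, or you need jars built for 2.11.

    // Launch the plain Scala REPL with the Spark assembly jar on the classpath.
    // The jar path is an assumption -- point it at your actual installation:
    //
    //   scala -cp /opt/spark-1.6.0/lib/spark-assembly-1.6.0-hadoop2.6.0.jar
    //
    // Once the REPL is up, the org.apache.spark packages resolve as usual:
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("repl-test").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    println(sc.parallelize(1 to 10).sum())  // quick sanity check: prints 55.0

    sc.stop()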

0 Answers