
Running spark-shell locally and defining the classpath to include some 3rd-party JARs:

$ spark-shell --driver-class-path /Myproject/LIB/*

Within the shell, I typed

scala> import com.google.common.collect.Lists
<console>:19: error: object collect is not a member of package com.google.common
   import com.google.common.collect.Lists
                            ^

I suppose Spark loads /usr/local/spark-1.4.0-bin-hadoop2.6/lib/spark-assembly-1.4.0-hadoop2.6.0.jar first, which doesn't contain the com.google.common.collect package.

/Myproject/LIB/ contains google-collections-1.0.jar, which does provide com.google.common.collect. However, this jar seems to be ignored.
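A rough way to check from inside the shell whether the jar actually ended up on the driver classpath (a sketch assuming a Linux ':' path separator, using the class from this question) is to inspect the JVM classpath and try to load the class directly:

scala> System.getProperty("java.class.path").split(":").filter(_.contains("google"))  // empty array if the jar is not on the driver classpath
scala> Class.forName("com.google.common.collect.Lists")  // throws ClassNotFoundException if the class is not visible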

Question: how can I tell spark-shell to load the JARs in --driver-class-path before those in spark-1.4.0-bin-hadoop2.6/lib/?

ANSWER: combining hints from Sathish's and Holden's comments:
--jars must be used instead of --driver-class-path. All jar files must be specified, comma-delimited with no spaces (as per spark-shell --help).

$ spark-shell --jars $(echo ./Myproject/LIB/*.jar | tr ' ' ',')
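For reference, the $(...) subshell just expands the glob and replaces the spaces with commas; with google-collections-1.0.jar plus a second, hypothetical other-lib.jar in the directory, it is equivalent to typing:

$ spark-shell --jars ./Myproject/LIB/google-collections-1.0.jar,./Myproject/LIB/other-lib.jar

After that the import resolves (output shown as a sketch):

scala> import com.google.common.collect.Lists
import com.google.common.collect.Lists
scala> Lists.newArrayList("a", "b")
res0: java.util.ArrayList[String] = [a, b]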
Polymerase
  • Try using the --jars option as follows: spark-shell --jars /Myproject/LIB/google-collections-1.0.jar – Sathish Jun 13 '15 at 04:38
  • @Sathish --jars is indeed the solution, but it's quite long to type since all jars must be specified. Thanks to Holden's trick below, which shortens the syntax. The initial question has been edited to include the answer. Thx. – Polymerase Jun 13 '15 at 15:21

1 Answer


The driver class path flag needs to be comma separated. So, based on Setting multiple jars in java classpath, we can try spark-shell --driver-class-path $(echo ./Myproject/LIB/*.jar | tr ' ' ',')
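The same glob trick works with any separator by changing the second argument to tr; for example, a conventional colon-separated Java classpath string would be built with (sketch):

$ echo ./Myproject/LIB/*.jar | tr ' ' ':'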

Holden
  • Thanks for the trick using `$(echo ./Myproject/LIB/*.jar | tr ' ' ':')`. The separator must actually be a comma. I have edited my initial question to include the answer. – Polymerase Jun 13 '15 at 15:18