
I am trying to connect to MongoDB from Spark using the MongoDB Spark connector in Java, and I get an error referring to "com.mongodb.spark.config.writeconfig" when I submit the jar and run it in spark-shell. (Error screenshot omitted.)

Could you please help me resolve this issue? I have also tried the following, without success:

  • $./bin/sparkR --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred"

  • $./bin/sparkR --conf "spark.mongodb.output.uri=mongodb://127.0.0.1/db.test"

  • $./bin/spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0

  • $spark-submit --master local --class com.test.spark.SparkClient /home/otalogin/SparkClient.jar

  • $spark-submit --master local --class com.test.spark.SparkClient /home/otalogin/SparkClient.jar --jar mongo-spark-connector_2.11:2.2.0

but I am getting the same error each time.

Please help me out with this issue.

Vinay Mishra

1 Answer


As suggested by Darshan M, you need to provide the MongoDB Spark connector dependencies to your application at runtime.
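One quick way to provide them at submit time is the `--packages` flag, which resolves the connector and its transitive dependencies from Maven Central. This is a sketch reusing the class name, jar path, and connector version from the question; requires a Spark installation and network access:

```shell
# --packages downloads the connector at submit time,
# so the application jar does not need to bundle it.
# Note: the flag is --packages (Maven coordinates), not --jar.
spark-submit \
  --master local \
  --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0 \
  --class com.test.spark.SparkClient \
  /home/otalogin/SparkClient.jar
```

The third attempt in the question fails because `--jar` is not a valid spark-submit option; local jar files go to `--jars` (comma-separated paths) and Maven coordinates go to `--packages`.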

The easiest way is to build a fat (uber) jar with Maven or sbt, so that the connector classes are bundled into your application jar.

If you use Maven, the maven-shade-plugin can build such a jar; if you use sbt, the sbt-assembly plugin does the same.
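A minimal sketch of the Maven route, assuming the Scala 2.11 build and connector version 2.2.0 from the question (the shade-plugin version is illustrative):

```xml
<dependencies>
  <dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- maven-shade-plugin bundles the connector classes into one fat jar -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.1.0</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

After `mvn package`, submit the shaded jar with spark-submit as before; the connector classes (including `WriteConfig`) are then on the application classpath.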

Bameza