
I'm modifying the HDFS module inside Hadoop, and I would like to see my changes reflected while running Spark on top of it, but I still see the stock Hadoop behaviour. I've checked and found that Spark builds a really fat jar file which contains all the Hadoop classes (using the Hadoop profile defined in Maven), and deploys it to all the workers. I also tried bigtop-dist to exclude the Hadoop classes, but saw no effect.

Is it possible to do such a thing easily, for example by small modifications inside the maven file?

saman

1 Answer


I believe you are looking for the provided scope on Maven artifacts. It excludes a dependency's classes from the packaged artifact while still letting you compile against them, with the expectation that your runtime environment will supply them at the correct versions. See here and here for further discussion.
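As a minimal sketch, marking the Hadoop dependency as provided in a pom.xml would look something like this (the hadoop-client artifact and version shown are illustrative; match them to whatever your Spark build actually pulls in):

    <!-- Compile against Hadoop, but do not bundle it into the fat jar. -->
    <!-- The cluster's own Hadoop installation (your modified HDFS build) -->
    <!-- is expected to supply these classes at runtime. -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.7.1</version>
      <scope>provided</scope>
    </dependency>

Depending on your Spark version, the build may also ship a hadoop-provided profile (mvn -Phadoop-provided ...) that applies this scope for you; check Spark's pom.xml to confirm before relying on it.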

Rohan Aletty