
I'm using Scala 2.10 and Gradle 1.11.

My problem is that the compiled JAR throws an error when I try to run it on the Hadoop cluster. I want to run on Hadoop because I'm using Scalding.

The exception is:

Exception in thread "main" java.io.FileNotFoundException: /tmp/hadoop-root/hadoop-unjar6538587701808097105/com/twitter/bijection/GeneratedTupleCollectionInjections$$anon$31$$anonfun$invert$10$$anonfun$apply$46$$anonfun$apply$47$$anonfun$apply$48$$anonfun$apply$49$$anonfun$apply$50$$anonfun$apply$51$$anonfun$apply$52$$anonfun$apply$53$$anonfun$apply$54$$anonfun$apply$55.class (File name too long)

Any comments are welcome...

gioele

1 Answer


Adding `-Xmax-classfile-name 200` to the scalac options should fix that.
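Since the question uses Gradle, here is a minimal sketch of where that flag could go in a `build.gradle`; the task-configuration block is an assumption on my part, not something from the original answer:

```groovy
// build.gradle -- a minimal sketch, assuming the standard Gradle Scala plugin.
apply plugin: 'scala'

// Pass the flag to every Scala compilation task. '-Xmax-classfile-name 200'
// caps the length of generated class file names at 200 characters.
tasks.withType(ScalaCompile) {
    scalaCompileOptions.additionalParameters = ['-Xmax-classfile-name', '200']
}
```

After this change, the deeply nested `$$anonfun` class names scalac generates are hashed down to at most 200 characters, so they unpack cleanly on the cluster's filesystem.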

Sources:

https://issues.scala-lang.org/browse/SI-3623

https://groups.google.com/forum/#!topic/simple-build-tool/wtD6vgdiy6g

My other car is a cadr
  • But isn't the problem caused by a `scalding` internal class, which he has no control over while compiling his own sources? After all, the error was about a Scalding class: `com/twitter/bijection/GeneratedTupleCollectionInjections...` – Jas Jan 08 '15 at 15:22
  • Agree with what @Jas said. The scalac option only works for your own classes, giving them shorter file names. – maxjakob Nov 10 '15 at 20:13
  • To expand on this a bit, you want to _reduce_ the max-classfile-name value so that you can sneak under the hard 255-character file name limit in Linux. – Azuaron Jun 01 '17 at 14:20