I have added a custom value to conf/spark-defaults.conf, but that value is not being picked up.
stephen@ubuntu:~/spark-1.2.2$ cat conf/spark-defaults.conf
spark.akka.frameSize 92345678
Here is how I run my program, LBFGSRunner:
sbt/sbt '; project mllib; runMain org.apache.spark.mllib.optimization.LBFGSRunner spark://ubuntu:7077'
The run fails with the following error, which shows the conf setting was not applied (the frame size is still the 10 MB default):
[error] Exception in thread "main" org.apache.spark.SparkException:
Job aborted due to stage failure: Serialized task 0:0 was 26128706 bytes,
which exceeds max allowed: spark.akka.frameSize (10485760 bytes) -
reserved (204800 bytes). Consider increasing spark.akka.frameSize
or using broadcast variables for large values
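For context on what I have tried: my understanding is that spark-defaults.conf is read by the spark-submit / spark-shell launch scripts, so a program launched directly through sbt runMain may bypass it entirely. Also, spark.akka.frameSize is specified in megabytes (capped well below 92345678), while the error message reports bytes. A sketch of the two usual ways to pass the setting instead, assuming a standard Spark 1.x installation (the application jar path here is hypothetical):

```
# Option 1: pass the setting on the spark-submit command line
# (spark-submit is what actually reads spark-defaults.conf)
bin/spark-submit \
  --class org.apache.spark.mllib.optimization.LBFGSRunner \
  --conf spark.akka.frameSize=100 \
  path/to/app.jar spark://ubuntu:7077

# Option 2: set it programmatically before creating the SparkContext,
# which works regardless of how the JVM was launched:
#   val conf = new SparkConf().set("spark.akka.frameSize", "100")
#   val sc = new SparkContext(conf)
```

Note the value "100" means 100 MB; if the serialized task is ~26 MB as in the error above, any value comfortably above that should do.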