
I noticed that when I start my Spark EC2 cluster from my local machine with `spark/ec2/spark-ec2 start mycluster`, the setup routine has a nasty habit of destroying everything I put in my cluster's `spark/conf/`. Short of having to run a `put-my-configs-back.sh` script every time I start up my cluster, is there a "correct" way to set up persistent configurations that will survive a stop/start? Or just a better way?

I'm working off of Spark master locally and Spark 1.2 in my cluster.

Noah
  • This *may* be a bug in `spark-ec2`. It looks like the script calls [the same internal setup method when starting an existing cluster](https://github.com/apache/spark/blob/v1.2.0/ec2/spark_ec2.py#L1065) as it does when launching a new cluster, so perhaps that's why your configs are getting blown away. – Nick Chammas Dec 27 '14 at 18:58
  • Just filed this as [SPARK-4977](https://issues.apache.org/jira/browse/SPARK-4977) – Noah Dec 27 '14 at 19:36

0 Answers