Once a SparkML model has been trained on a Spark cluster, how can I make the trained model available for scoring through a RESTful API?
The problem is that loading the model requires a SparkContext, even though a full cluster context does not seem strictly necessary just for scoring. Is there a way to "fake" one, or failing that, what is the minimum configuration needed to create a SparkContext?