I am quite new to Scala (and Spark, if this is somehow Spark-specific), so please forgive the super simple question.
To me, it seems like this code should compile just fine:
sqlContext.udf.register("json_extract_string", (rawJson: String, keyPath: String*) => {
  UDFs.jsonExtract[String](rawJson, keyPath: _*)
})
Yet compiling gives the error:
Error:(31, 89) ')' expected but identifier found.
sqlContext.udf.register("json_extract_string", (rawJson: String, keyPath: String*) => {
^
Why is this?
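For what it's worth, a named method with a repeated parameter compiles fine, and I can even lift it to a function value. Here is a minimal sketch (names are made up, not from my real code) showing the difference I'm seeing:

```scala
object VarargsSketch {
  // This line does not parse -- a function literal's parameter list
  // apparently cannot declare a repeated parameter:
  // val bad = (xs: String*) => xs.mkString(",")

  // A method can declare one, though:
  def join(xs: String*): String = xs.mkString(",")

  // And eta-expanding the method gives a function value whose
  // repeated parameter has become a plain Seq[String]:
  val f: Seq[String] => String = join _
}
```

So the restriction seems specific to the literal syntax, which is what confuses me.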
The function being called looks like this:
object UDFs {
  def jsonExtract[T: Manifest](rawJson: String, keyPath: String*): Option[T] = {
    implicit val formats = DefaultFormats
    val json = parse(rawJson)
    keyPath.foldLeft(json)(_ \ _).extractOpt[T]
  }
}
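A fixed-arity wrapper around the method does compile for me, which suggests the varargs parameter in the literal itself is the problem. Sketch below, with a stand-in method instead of the real `jsonExtract` (so it doesn't need the JSON library):

```scala
object UDFs2 {
  // Stand-in for the real varargs method, just to show the wrapping shape
  def joinPath(rawJson: String, keyPath: String*): String =
    rawJson + ":" + keyPath.mkString(".")
}

// A plain (String, String) => String literal is accepted, and it can
// forward its single key to the varargs method:
val extractOne: (String, String) => String =
  (rawJson, key) => UDFs2.joinPath(rawJson, key)
```

But that obviously fixes the arity at two arguments, which is not what I originally wanted.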