
I have a shell script in which I define some date variables, and in the same file I invoke Scala code via the spark-shell command.

The variables defined in the shell script are meant to be used as date values (filters) in my Scala code (spark.sql).

However, the Scala code fails with an error saying the value is not found. I have tried to fix it, but the issue still persists.

Could you please help me with this?

Thanks and Regards,

Vimarsh

abc

1 Answer


First, pass the path of a properties file to your Spark job as an argument, and parse that file in Scala using java.util.Properties.

Then read the value of a key from it and save it in a Scala variable.

Finally, use that variable in your spark.sql call through string interpolation: spark.sql(s"select * from table where d='${variable_name}'")
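A minimal sketch of the shell-script side, assuming a hypothetical key name `run.date`, file path `/tmp/job.properties`, and script name `my_job.scala` (none of these come from the question):

```shell
# Write the date filter from a shell variable into a properties file
# (key name and path are assumptions for illustration)
RUN_DATE="2023-01-15"
printf 'run.date=%s\n' "$RUN_DATE" > /tmp/job.properties

# One common way to hand the path to the Scala code is an environment
# variable; the launch itself is commented out here since it needs Spark:
# PROPS_FILE=/tmp/job.properties spark-shell -i my_job.scala
cat /tmp/job.properties
```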
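Putting the steps together, here is a sketch of the Scala side. The key `run.date`, the `PROPS_FILE` environment variable, and the table/column names are assumptions for illustration, and the last line requires a running spark-shell session:

```scala
import java.io.FileInputStream
import java.util.Properties

// Read the properties file whose path was handed over by the shell script
// (assumed here to arrive via the PROPS_FILE environment variable)
val props = new Properties()
val in = new FileInputStream(sys.env("PROPS_FILE"))
try props.load(in) finally in.close()

// Save the key's value in a Scala variable...
val runDate = props.getProperty("run.date")

// ...and interpolate it into the SQL string (only works inside spark-shell,
// where the `spark` session is predefined)
spark.sql(s"select * from table where d='$runDate'").show()
```

If you run this with `spark-shell -i my_job.scala`, the interpolated query will filter on whatever date the shell script wrote into the file.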

Rony