Is there any way to query a column that does not exist in Spark SQL? I receive a stream of JSON records from Spark Streaming and want to register them as a temporary table so that I can query them with SQL. I create the temp table like this:
SparkSession ss = SparkSession.builder().config(sparkConf).getOrCreate();
Dataset<Row> df = ss.read().json(rdd); // rdd is the JavaRDD<String> of JSON records from streaming
df.createOrReplaceTempView("tmp_table"); // registerTempTable is deprecated in Spark 2.x
The table's schema is 'username, passwr, uid, kid', but when I run a query like:
ss.sql("select * from tmp_table where xxx=1");
"xxx" is a column which does not in this table's column.
How can I do for it ? I want to get none result when the column does not exist but no error like this:
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'xxx' given input columns
Because I don't know which columns will exist until I actually receive the data, I cannot hard-code a fixed set of columns; I only get the SQL condition beforehand.
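So far, the only workaround I can think of is to inspect the table's actual columns at runtime before issuing the query. A minimal sketch, assuming the column name (columnName here) has already been parsed out of the configured condition:

import java.util.Arrays;

Dataset<Row> df = ss.table("tmp_table");
String columnName = "xxx"; // parsed from the configured condition (my assumption)
Dataset<Row> result;
if (Arrays.asList(df.columns()).contains(columnName)) {
    result = ss.sql("select * from tmp_table where " + columnName + " = 1");
} else {
    result = df.limit(0); // empty Dataset with the same schema, no AnalysisException
}

But this means parsing the column names out of every configured condition myself, which seems fragile.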
Besides, I want the SQL condition to come from configuration rather than being written in my code. When a column in the condition does not exist, I just want an empty result, or a result that treats the missing column as null, instead of an exception.
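For the "treat the missing column as null" variant, I was considering adding the column as a null literal before registering the view; a rough sketch, reusing df and columnName from above (the cast to "int" is my assumption about the condition's type):

import org.apache.spark.sql.functions;

if (!Arrays.asList(df.columns()).contains(columnName)) {
    // Hypothetical: materialize the missing column as null so the configured SQL still parses
    df = df.withColumn(columnName, functions.lit(null).cast("int"));
    df.createOrReplaceTempView("tmp_table");
}
Dataset<Row> result = ss.sql("select * from tmp_table where " + columnName + " = 1");
// "where xxx = 1" now returns zero rows instead of throwing, since "null = 1" is never true

Is there a cleaner, built-in way to get this behavior?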
Thank you for any suggestions.