We have tried wrapping the column name in brackets ([column name]), in single and double quotes, and in backticks, but none of them work.
Does Spark SQL support columns whose name contains spaces?
Backticks seem to work just fine:
scala> val df = sc.parallelize(Seq(("a", 1))).toDF("foo bar", "x")
df: org.apache.spark.sql.DataFrame = [foo bar: string, x: int]
scala> df.registerTempTable("df")
scala> sqlContext.sql("""SELECT `foo bar` FROM df""").show
+-------+
|foo bar|
+-------+
|      a|
+-------+
The same works with the DataFrame API:
scala> df.select($"foo bar").show
+-------+
|foo bar|
+-------+
|      a|
+-------+
So it looks like it is supported, although I doubt it is recommended practice.
Instead of using brackets as in T-SQL ([column name]), use backticks to wrap the column name: `column name`. This applies when you run SQL directly. In Spark SQL you likewise wrap the column name in backticks; the triple quotes in zero323's answer are just Scala's raw string syntax around the query, not part of the SQL itself.
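To make the backtick quoting reusable, here is a minimal sketch of a helper that wraps an identifier for embedding in a Spark SQL query string. `quoteIdentifier` is a hypothetical name, not a Spark public API; it assumes that Spark, like HiveQL, escapes a literal backtick inside a quoted identifier by doubling it.

```scala
// Hypothetical helper (not a Spark API): wrap an identifier in backticks
// so it can be embedded safely in a Spark SQL query string. A literal
// backtick inside the name is escaped by doubling it, as in HiveQL.
def quoteIdentifier(name: String): String =
  "`" + name.replace("`", "``") + "`"

// Usage: build the query string, then pass it to sqlContext.sql(...)
val query = s"SELECT ${quoteIdentifier("foo bar")} FROM df"
println(query)  // prints: SELECT `foo bar` FROM df
```

This keeps the quoting logic in one place instead of hand-writing backticks in every query string.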