When using Spark with Scala, I have found three ways to do what appears to be the same thing:
.select(to_avro(struct("*"), toAvroConfig).alias("value"))
.select(to_avro(struct("*"), toAvroConfig).as("value"))
.select(to_avro(struct("*"), toAvroConfig) as 'value)
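To compare the first two outside of my actual Avro job, here is a minimal, self-contained sketch; the local SparkSession and the toy id/label DataFrame are made up for illustration, and I left out to_avro/toAvroConfig since the question is only about the aliasing:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.struct

val spark = SparkSession.builder().master("local[*]").appName("alias-check").getOrCreate()
import spark.implicits._

val df = Seq((1, "a"), (2, "b")).toDF("id", "label")

// Both calls give one output column named "value"; the printed schemas are identical.
df.select(struct("*").alias("value")).printSchema()
df.select(struct("*").as("value")).printSchema()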
I found documentation for the first two:
alias
doc:
def alias(alias: String): Column = name(alias)
/**
* Gives the column an alias.
* {{{
* // Renames colA to colB in select output.
* df.select($"colA".as("colB"))
* }}}
*
* If the current column has metadata associated with it, this metadata will be propagated
* to the new column. If this not desired, use the API `as(alias: String, metadata: Metadata)`
* with explicit metadata.
*
* @group expr_ops
* @since 1.3.0
*/
as
doc:
def as(alias: String): Column = name(alias)
/**
* (Scala-specific) Assigns the given aliases to the results of a table generating function.
* {{{
* // Renames colA to colB in select output.
* df.select(explode($"myMap").as("key" :: "value" :: Nil))
* }}}
*
* @group expr_ops
* @since 1.4.0
*/
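As far as I can tell, that second scaladoc actually describes the as(aliases: Seq[String]) overload (the one for table generating functions) rather than as(alias: String). To convince myself, I tried the Seq-style overload on a made-up map column, reusing the toy df and spark.implicits._ from the sketch above ("myMap", "k", "v" are just illustrative names):

import org.apache.spark.sql.functions.{explode, lit, map}

// Same toy df as above; "myMap" is a made-up map column just to exercise explode.
val withMap = df.withColumn("myMap", map(lit("k"), $"label"))
withMap.select(explode($"myMap").as("k" :: "v" :: Nil)).show()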
For the last one, as 'value, I found it here. Initially I thought it was a typo, but it is actually valid. How does it work? Is there documentation for it?
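For what it's worth, a quick check with the same toy df (again assuming spark.implicits._ is in scope) compiles and prints the same schema as the first two forms:

import org.apache.spark.sql.functions.struct

// 'value is a plain Scala Symbol literal; this compiles, and the resulting schema
// is the same single "value" column I get from .alias / .as(String) above.
df.select(struct("*") as 'value).printSchema()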
Are all three the same? Thanks!