I have two DataFrames (Spark 2.2.0 and Scala 2.11.8). The first DataFrame, df1
, has a single column called col1
, and the second one, df2
, also has a single column, called col2
. Both DataFrames have the same number of rows.
How can I merge these two columns into a new DataFrame?
I tried join
, but I think there should be some other way to do it.
I also tried withColumn
, but it does not compile:
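For reference, the join attempt looked roughly like this — a sketch, where the id column is something I added myself via monotonically_increasing_id, not part of the original data:

```scala
import org.apache.spark.sql.functions.monotonically_increasing_id

// Add a synthetic id to each DataFrame, then join on it.
// Note: monotonically_increasing_id only guarantees increasing, unique
// values, not consecutive ones, so rows can misalign across partitions —
// which is why I suspect there is a better way.
val df1WithId = df1.withColumn("id", monotonically_increasing_id())
val df2WithId = df2.withColumn("id", monotonically_increasing_id())
val result = df1WithId.join(df2WithId, "id").drop("id")
```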
val result = df1.withColumn(col("col2"), df2.col1)
UPDATE:
For example:
df1 =
col1
1
2
3
df2 =
col2
4
5
6
result =
col1 col2
1 4
2 5
3 6