I have two DataFrames in Spark (Scala) registered as tables.
Table 1:
+-----+--------+
| id  | values |
+-----+--------+
|  0  |   v1   |
|  0  |   v2   |
|  1  |   v3   |
|  1  |   v1   |
+-----+--------+
Table 2:
+-----+----+----+----+
| id  | v1 | v2 | v3 |
+-----+----+----+----+
|  0  | a1 | b1 | -  |
|  1  | a2 | -  | c2 |
+-----+----+----+----+
I want to generate a new table using the above two tables.
Table 3:
+-----+--------+--------+
| id  | values | field  |
+-----+--------+--------+
|  0  |   v1   |   a1   |
|  0  |   v2   |   b1   |
|  1  |   v3   |   c2   |
|  1  |   v1   |   a2   |
+-----+--------+--------+
Here v1 is a struct of the form:
 |-- v1: struct (nullable = true)
 |    |-- level1: string (nullable = true)
 |    |-- level2: string (nullable = true)
 |    |-- level3: string (nullable = true)
 |    |-- level4: string (nullable = true)
 |    |-- level5: string (nullable = true)
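
For a reproducible example, the two inputs can be built roughly like this (the V case class and the placeholder strings are only illustrative of my real data; "-" in the tables above stands for null):

    import org.apache.spark.sql.SparkSession

    // Placeholder for the struct described above; the real level1..level5
    // contents do not matter for the question.
    case class V(level1: String, level2: String, level3: String,
                 level4: String, level5: String)

    val spark = SparkSession.builder().appName("example").master("local[*]").getOrCreate()
    import spark.implicits._

    // Table 1: each row names one of the vN columns of Table 2
    val table1 = Seq((0, "v1"), (0, "v2"), (1, "v3"), (1, "v1"))
      .toDF("id", "values")
    table1.createOrReplaceTempView("table1")

    // Table 2: one struct column per value name; "-" above is null here
    val table2 = Seq(
      (0, V("a1", null, null, null, null), V("b1", null, null, null, null), null.asInstanceOf[V]),
      (1, V("a2", null, null, null, null), null.asInstanceOf[V], V("c2", null, null, null, null))
    ).toDF("id", "v1", "v2", "v3")
    table2.createOrReplaceTempView("table2")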
I am using Spark SQL in Scala.
Is it possible to achieve this with a SQL query or with Spark functions on the DataFrames?
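
For illustration, a hard-coded version of what I am after (using the temp views from the sketch above) would be something like the following, but I am hoping there is a way that does not require spelling out every vN column in a CASE expression by hand:

    // Hard-coded lookup: join on id, then pick the struct column named by `values`.
    // All vN columns share the same struct type, so the CASE branches are compatible.
    val table3 = spark.sql("""
      SELECT t1.id,
             t1.`values`,
             CASE t1.`values`
               WHEN 'v1' THEN t2.v1
               WHEN 'v2' THEN t2.v2
               WHEN 'v3' THEN t2.v3
             END AS field
      FROM table1 t1
      JOIN table2 t2 ON t1.id = t2.id
    """)
    table3.show()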