
I am trying to achieve this via pyspark by building SQL. The goal is to combine multiple rows into a single row. Example: I want to convert this

+-----+----+----+-----+
| col1|col2|col3| col4|
+-----+----+----+-----+
|x    |  y |  z |13::1|
|x    |  y |  z |10::2|
+-----+----+----+-----+

To

+-----+----+----+-----------+
| col1|col2|col3|       col4|
+-----+----+----+-----------+
|x    |  y |  z |13::1;10::2|
+-----+----+----+-----------+
pault
sks27

2 Answers


What you're looking for is the spark-sql version of this answer, which is the following:

query = """
  select col1, 
         col2, 
         col3, 
         concat_ws(';', collect_list(col4)) as col4 
    from some_table 
group by col1, 
         col2, 
         col3
"""
spark.sql(query).show()
#+----+----+----+-----------+
#|col1|col2|col3|       col4|
#+----+----+----+-----------+
#|   x|   y|   z|13::1;10::2|
#+----+----+----+-----------+

But be aware that since Spark is distributed, `collect_list` is not guaranteed to return the values in any particular order unless you specify an ordering explicitly.
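If the order of the concatenated values matters, one possible sketch (assuming Spark 2.4+ for `array_sort` and `transform`, and a hypothetical ordering column `ord` in the source data, such as a timestamp or id) is to collect structs, sort them, and then extract `col4` before concatenating:

```sql
select col1,
       col2,
       col3,
       -- collect (ord, col4) pairs, sort by ord (the struct's first field),
       -- then keep only col4 from each sorted struct
       concat_ws(';',
                 transform(array_sort(collect_list(struct(ord, col4))),
                           x -> x.col4)) as col4
  from some_table
group by col1,
         col2,
         col3
```

`array_sort` orders the structs by their first field (`ord` here), which makes the concatenation order deterministic regardless of how the rows are distributed across partitions.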


pault

Expanding upon the suggestion made by @Barmar in a comment, you can run a SQL query like this:

SELECT col1, col2, col3, GROUP_CONCAT(col4)
FROM your_table
GROUP BY col1, col2, col3
Ike Walker
  • Thanks for your response Ike, but I get the following exception: Undefined function: 'GROUP_CONCAT'. This function is neither a registered temporary function nor a permanent function registered in the database 'default' – sks27 May 07 '19 at 16:19
  • GROUP_CONCAT() is a MySQL function. What database are you using? – Ike Walker May 07 '19 at 16:40