
So I'm trying to interleave 3 dataframes, and the result should look like this:

df1:
A
D
G

df2:
B
E
H

df3:
C
F
I

Resulting df:
A
B
C
D
E
F
G
H
I

I tried:

for i in len(df1+df2+df3):
    final_df.append(i)

I want to do this as efficiently as possible, and for n dataframes in general.


2 Answers


Referring to Spark unionAll multiple dataframes:

You can simply put all the data frames into a list, and do a unionAll on them, like so:

from functools import reduce
from pyspark.sql import DataFrame

dfs = [df1, df2, df3]                 # put all the frames into one list
df = reduce(DataFrame.unionAll, dfs)  # pairwise unionAll across the whole list
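On Spark 2.0+ the same reduce pattern also works with DataFrame.union (unionAll behaves the same way). A minimal, self-contained sketch, assuming single-column frames built to match the question's example (the column name "value" and the createDataFrame calls are just for illustration):

from functools import reduce
from pyspark.sql import DataFrame, SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical single-column frames matching the layout in the question.
df1 = spark.createDataFrame([("A",), ("D",), ("G",)], ["value"])
df2 = spark.createDataFrame([("B",), ("E",), ("H",)], ["value"])
df3 = spark.createDataFrame([("C",), ("F",), ("I",)], ["value"])

dfs = [df1, df2, df3]              # any number of dataframes with the same schema
df = reduce(DataFrame.union, dfs)  # df2's rows follow df1's, then df3's, and so on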
subhaoi

Use pd.concat:

pd.concat([df1, df2, df3], ignore_index=True)  # ignore_index=True renumbers the combined rows 0..n-1

You can concat as many dataframes as you want.
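
Note that pd.concat on its own stacks the frames one after the other (A, D, G, B, E, H, C, F, I). If you need the interleaved order shown in the question, one way is to lean on the original row positions and a stable sort. A sketch, assuming each input frame has a default RangeIndex (0, 1, 2, ...); the variable names are just for illustration:

import pandas as pd

dfs = [df1, df2, df3]                           # any number of dataframes
interleaved = (pd.concat(dfs)                   # keeps each frame's original 0, 1, 2, ... index
                 .sort_index(kind='mergesort')  # stable sort: row 0 of every frame, then row 1, ...
                 .reset_index(drop=True))

The stable sort is what preserves the df1, df2, df3 order among rows that share the same original index, which is exactly the A, B, C, D, E, F, ... interleaving.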

Code Different