
I am trying to merge two large dataframes (one column each, but many rows), but I get a memory error when doing so. My attempt so far has been to merge only the first 10,000 rows, which works but is obviously not a viable solution:

    df_merged = pd.merge(pd.DataFrame(df1.iloc[0:10000]), pd.DataFrame(df2.iloc[0:10000]), on=['tickers', 'level_1'])

Can anyone suggest a solution to this problem?
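
A minimal sketch of taking the chunked attempt all the way: merge `df1` against `df2` one slice at a time and concatenate the pieces, so no single intermediate merge result has to be built in one go. The stand-in data below (the `'AAPL'`/`'MSFT'` tickers, the column names `col_1`/`col_2`, and the `chunk_size` value) is hypothetical and only mirrors what the merge call above implies about the real series.

    import pandas as pd

    # Hypothetical stand-ins for the real df1/df2: single-column series indexed
    # by ('tickers', 'level_1'), which is what the merge call above implies.
    idx = pd.MultiIndex.from_product([['AAPL', 'MSFT'], range(3)],
                                     names=['tickers', 'level_1'])
    df1 = pd.Series(range(6), index=idx, name='col_1')
    df2 = pd.Series(range(6, 12), index=idx, name='col_2')

    # Merge df1 against df2 one slice at a time and stitch the pieces together,
    # so no single intermediate merge result has to hold everything at once.
    chunk_size = 100_000  # arbitrary example value; tune to available memory
    pieces = []
    for start in range(0, len(df1), chunk_size):
        left = pd.DataFrame(df1.iloc[start:start + chunk_size])
        pieces.append(pd.merge(left, pd.DataFrame(df2), on=['tickers', 'level_1']))

    df_merged = pd.concat(pieces)

If even the chunked merge runs out of memory, an out-of-core library such as dask.dataframe, which offers a pandas-like merge, is the usual next step (see the duplicate linked in the comments below).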

Tartaglia
  • Do they have the same index? If it's just one column, why not just `df1['col_2'] = df2['col']`? (See the sketch after these comments.) – rafaelc Jul 22 '19 at 23:38
  • Possible duplicate of [Best way to join two large datasets in Pandas](https://stackoverflow.com/questions/37756991/best-way-to-join-two-large-datasets-in-pandas) – Erfan Jul 23 '19 at 00:13
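
For the first comment's suggestion, here is a minimal sketch assuming both objects are single-column series that share the same `('tickers', 'level_1')` index; the stand-in data and the column names `col_1`/`col_2` are placeholders, not names from the question.

    import pandas as pd

    # Hypothetical stand-ins for df1/df2: two single-column series that share
    # the same ('tickers', 'level_1') index, as the suggestion assumes.
    idx = pd.MultiIndex.from_product([['AAPL', 'MSFT'], range(3)],
                                     names=['tickers', 'level_1'])
    df1 = pd.Series(range(6), index=idx, name='col_1')
    df2 = pd.Series(range(6, 12), index=idx, name='col_2')

    # No merge needed: assigning the second series as a new column aligns it
    # on the shared index, which sidesteps the memory-hungry join entirely.
    df_merged = pd.DataFrame(df1)
    df_merged['col_2'] = df2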

0 Answers