I have 180,000 pandas Series that I need to combine into one DataFrame. Adding them one by one takes a long time, apparently because appending gets increasingly slower as the DataFrame grows. The same problem persists even if I use NumPy, which is otherwise faster than pandas at this.
What could be an even better way to create a DataFrame from the Series?
Edit: Some more background info. The Series were stored in a list. It is sports data, and the list was called player_library, with 180,000+ items. I didn't realise that it is enough to write just
pd.concat(player_library, axis=1)
instead of listing all the individual items. Now it works fast and nicely.
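For anyone hitting the same issue, a minimal sketch of the fix (player_library here is a small synthetic list of Series standing in for the real data; the names and sizes are illustrative):

```python
import pandas as pd
import numpy as np

# Stand-in for the real player_library: a list of Series,
# one per player (names and lengths are made up for the example).
player_library = [
    pd.Series(np.arange(5), name=f"player_{i}") for i in range(1000)
]

# A single concat call combines all Series in one pass, avoiding the
# quadratic cost of appending columns to a growing DataFrame one by one.
df = pd.concat(player_library, axis=1)

print(df.shape)  # (5, 1000): one row per stat, one column per Series
```

With axis=1 each Series becomes a column, aligned on the shared index; appending in a loop instead copies the whole DataFrame on every step, which is what made the original approach slow.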