
I'm working with fairly large data and I need to write it to an xlsx file. Sometimes these files can reach 15GB. I have a Python script that receives the data as DataFrames and writes it to Excel continuously, so I need to append to an existing workbook and an existing sheet. I was using 'openpyxl'. There are two problems I faced while working with that library.

  1. Firstly, to append to an existing workbook, openpyxl needs to load the whole workbook, which is impossible for me because of the data size. I must keep RAM usage as low as possible.
  2. Secondly, the library is only useful for writing to different sheets. When I try to write data to the same sheet, even if I give a 'startrow' for the saving process, it deletes the old data and writes the new data starting from that row.
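
For reference, a minimal sketch of the append-to-same-sheet pattern (file and sheet names are illustrative; `if_sheet_exists="overlay"` requires pandas 1.4+, and note that append mode still loads the whole workbook into memory, so it does not solve the RAM problem):

```python
import pandas as pd

# Initial write: creates the workbook with a header row.
df1 = pd.DataFrame({"a": [1, 2]})
df1.to_excel("report.xlsx", sheet_name="data", index=False)

# Later append: "overlay" writes into the existing sheet instead of
# replacing it, starting at the given (0-indexed) startrow.
df2 = pd.DataFrame({"a": [3, 4]})
with pd.ExcelWriter("report.xlsx", engine="openpyxl", mode="a",
                    if_sheet_exists="overlay") as writer:
    df2.to_excel(writer, sheet_name="data", startrow=3,
                 header=False, index=False)
```

With `overlay`, the header and the first frame's rows survive and the second frame lands below them; without it, older pandas/openpyxl combinations replace the sheet, which is the behavior described above.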

I have already tried the solution available here to address my problem, but it doesn't fit my requirements.

Do you have any idea how I can do this?

Jiraiya
  • At the risk of sounding like a cliché SO answer, you should probably consider switching to a database and generating on-demand Excel reports. The file sizes you're discussing just aren't manageable in the paradigm you're working under right now. – vencaslac Sep 23 '20 at 11:17
  • I know this data is unmanageable with this approach. The project will work with a database, but due to management issues, right now I need to go with this solution. – Jiraiya Sep 23 '20 at 12:13
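
A minimal sketch of the suggested approach (table and file names are hypothetical): stage incoming rows in SQLite, then stream the on-demand report through openpyxl's write-only mode, which flushes each row to disk instead of keeping the workbook in RAM:

```python
import sqlite3
from openpyxl import Workbook

# Stage incoming rows in a database instead of appending to xlsx.
conn = sqlite3.connect("staging.db")
conn.execute("CREATE TABLE IF NOT EXISTS rows (a INTEGER, b INTEGER)")
conn.executemany("INSERT INTO rows VALUES (?, ?)", [(1, 2), (3, 4)])
conn.commit()

# Generate the report on demand: write-only workbooks stream rows out
# as they are appended, so memory stays roughly constant.
wb = Workbook(write_only=True)
ws = wb.create_sheet("data")
ws.append(["a", "b"])  # header row
for row in conn.execute("SELECT a, b FROM rows"):
    ws.append(row)     # each row is serialized immediately
wb.save("ondemand_report.xlsx")
conn.close()
```

The trade-off is that a write-only workbook can only create a new file; it cannot append to an existing one, so the report is regenerated from the database each time.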

0 Answers