I need to write a Python script that takes an existing file split into rows and columns, reads in each column, and outputs each column as a row in the output file. The file is a matrix of numbers, and the task is to transpose that matrix and write it out. The issue is that my matrix file is so huge that I literally cannot hold the entire thing in memory; attempting to do so crashes with a MemoryError. Every solution I've found so far either requires grabbing the entire infile at once, reading through every row of the infile over and over again to grab one number at a time, or reading through the infile once but re-parsing the outfile over and over to append the next number to each row.
Example input:
1,2,3
4,5,6
7,8,9
example output:
1,4,7
2,5,8
3,6,9
Additional info: The file is plaintext. The delimiter can be a comma, a space, or a tab, depending on the file. The matrix is not square.
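For reference, this is the straightforward in-memory approach that works on the small example above but crashes with a MemoryError on my real files (a sketch assuming a comma delimiter; the function name is my own):

```python
def transpose_in_memory(lines):
    """Naive transpose: materializes every input row in memory at once,
    which is exactly what fails on files too large to fit in RAM."""
    rows = [line.strip().split(",") for line in lines]
    # zip(*rows) pairs up the k-th element of every row, i.e. column k.
    return [",".join(col) for col in zip(*rows)]

print(transpose_in_memory(["1,2,3", "4,5,6", "7,8,9"]))
```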
Edit: Final solution.
Unfortunately, it seems the task could not be done the way I wanted. Under these tight memory constraints, there wasn't much to do besides looping through either the infile or the outfile multiple times. So the final solution is to read the infile once per output row, constructing each output row from one column of the input, writing it out, and repeating for the next column.