
I'm using the code found here - How to export a CSV to Excel using Powershell

However, when I try to convert a file with, say, 302,123 rows (and 48 columns), it only converts up to 65,536 rows. Is there a way to convert the whole file? I read that the 65,536-row cap is an Excel 2003 limit, but I'm not sure how to make the conversion target the Excel 2007-and-up format instead...
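For reference, the linked answer drives Excel through COM; as far as I can tell the pattern boils down to something like the following (a trimmed sketch rather than the exact code from that answer, with placeholder paths). The key detail is the `SaveAs` file format: 51 (`xlOpenXMLWorkbook`) produces an .xlsx workbook, which supports 1,048,576 rows, whereas the legacy .xls format stops at 65,536.

```powershell
# Sketch of the COM-based CSV-to-xlsx conversion (paths are placeholders).
$csvPath  = "C:\temp\bigfile.csv"
$xlsxPath = "C:\temp\bigfile.xlsx"

$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$excel.DisplayAlerts = $false                 # suppress overwrite/format prompts

$workbook = $excel.Workbooks.Open($csvPath)   # Excel parses the CSV into rows
$workbook.SaveAs($xlsxPath, 51)               # 51 = xlOpenXMLWorkbook (.xlsx)
$workbook.Close($false)

$excel.Quit()
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
```

Note that this still pushes the entire file through the Excel process, so the memory issues discussed below still apply.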

Fiddle Freak
  • A CSV file has no limit on the number of rows. The only problem you have is when importing into Excel, which does have an upper limit on the number of rows. What specifically are you trying to do? – Bill_Stewart Jan 27 '16 at 16:49
  • Convert the csv file into a .xlsx file, because it takes far less storage. The file with about 1,000 rows is 936 KB; converting it to .xlsx makes it 323 KB. The file with 302,123 rows is about 356 MB. How can I convert it into a .xlsx and keep all the data? If I try to open the csv file with the Excel GUI, it gives an error saying the data is too large... – Fiddle Freak Jan 27 '16 at 16:53
  • Converting a CSV with hundreds of thousands of rows and hundreds of MB in size to an Excel workbook is insane. Keep the data as a CSV. Compress it into a zip archive if you're pressed for space (recent Excel versions technically do the same thing). Or import the CSV into an actual database. – Ansgar Wiechers Jan 27 '16 at 17:01
  • Wikipedia says this: "65536 is the maximum number of spreadsheet rows supported by Excel 97, Excel 2000, Excel 2002 and Excel 2003 (Excel 2007, 2010 and 2013 support 1,048,576 rows (2^20)). Text files that are larger than 65536 rows cannot be imported to these versions of Excel." So you'd need at least Excel 2007 to go beyond 64K rows. As with all SO questions you should google first, then ask questions if after a fair amount of searching you don't get an answer. – Χpẘ Jan 27 '16 at 17:06
  • @AnsgarWiechers yes, it is insane, but it's what the customer is asking for. As of right now the only way to view the file is Notepad. However, what if I want to view it using Excel? Or is there another editor/viewer (other than a database) I can use for this? @user2460798: yes, we are using Excel 2007, and I did spend about an hour searching for a way before asking. – Fiddle Freak Jan 27 '16 at 17:06
  • Just to be clear, if I try to open the `.csv` file (containing `302,123 rows` and `48 columns`, `356mb`) using `excel 2007`, I get the error message `Excel cannot complete this task with available resources. Choose less data or close other applications`. After clicking OK on the error, it does open a partial amount of the file (up to row 153,689). – Fiddle Freak Jan 27 '16 at 17:12
  • Seems like the response in the error message is clear. "Close other applications". – Χpẘ Jan 27 '16 at 17:17
  • @user2460798 okay, so you are saying I need more memory than 8gb of ram (all other applications are closed). – Fiddle Freak Jan 27 '16 at 17:18
  • I would look at Taskmgr's perf tab while you are importing the data into Excel. If the memory graph gets close to 100% then I'd say yes. Another possibility is to try creating several smaller spreadsheets and then use Excel copy'n'paste to merge them. – Χpẘ Jan 27 '16 at 17:22
  • @user2460798 hmmm, it might be a good idea to split the file into separate csv files (`part1.csv`, `part2.csv`, and so forth), each containing no more than 50,000 rows, then convert those to xlsx files. – Fiddle Freak Jan 27 '16 at 17:35
  • Sometimes the customer has to be informed that what they are requesting cannot be done in a certain way due to technology constraints. – Bill_Stewart Jan 27 '16 at 17:58
  • @FiddleFreak, splitting up the CSV is what I'd recommend. Excel giving you memory issues isn't all about how much RAM your system has, it is how much Excel can use. I don't think Office 2007 came in a 64-bit flavor, so Excel can probably only use between 2GB and 4GB. Even if you loaded 64 more gigs of RAM onto the computer, Excel wouldn't be able to utilize it. – romellem Jan 27 '16 at 21:20
  • To add to what @Bill_Stewart said, it'd be the same if the client asked for a 100,000 × 100,000 pixel JPEG image. The file format only supports up to 65,535 × 65,535 pixels ([source](https://en.wikipedia.org/wiki/JPEG)), so even if the client requests it, in this case, it is not possible. – romellem Jan 27 '16 at 21:24

1 Answer


It sounds like you are hitting the memory limit (a theoretical 2 GB virtual address space) for Excel 2007, which is a 32-bit process.

File size is not the same as memory usage, because Excel has to convert the data into rows in memory. For example, I just converted a 650k-line CSV file (15 MB) using the linked answer in your post, and my 64-bit Excel 2013 used at least 70-80 MB of memory before it was done. Imagine your 350+ MB CSV file.

If you really need to convert that many rows to an Excel document, you should use a computer with 64-bit Office 2010 or newer.

Personally, I would recommend a SQL database for this amount of data.
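If a SQL Server instance happens to be available, a rough sketch of bulk-loading the CSV with .NET's `SqlBulkCopy` could look like the following (the connection string, path, and table name are placeholders, and the destination table is assumed to already exist with columns in the same order as the CSV):

```powershell
# Sketch: bulk-load a CSV into an existing SQL Server table via SqlBulkCopy.
# Connection string, path, and table name are placeholders.
$connectionString = "Server=localhost;Database=Staging;Integrated Security=True"
$csvPath          = "C:\temp\bigfile.csv"
$tableName        = "dbo.BigImport"

# Build a DataTable from the CSV. Import-Csv holds the rows in memory,
# which is workable for a few hundred thousand rows but not for truly huge files.
$rows      = Import-Csv -Path $csvPath
$dataTable = New-Object System.Data.DataTable
$rows[0].PSObject.Properties | ForEach-Object { [void]$dataTable.Columns.Add($_.Name) }
foreach ($row in $rows) {
    $values = [object[]]($row.PSObject.Properties | ForEach-Object { $_.Value })
    [void]$dataTable.Rows.Add($values)
}

# SqlBulkCopy maps columns by ordinal position by default.
$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy -ArgumentList $connectionString
$bulkCopy.DestinationTableName = $tableName
$bulkCopy.BulkCopyTimeout      = 0
$bulkCopy.WriteToServer($dataTable)
$bulkCopy.Close()
```

For files much larger than yours, a streaming reader or the `bcp` command-line utility would avoid building the whole DataTable in memory.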

Frode F.
  • Thanks, I just ended up splitting the file into separate files (since upgrading or using other tools is not an option). I also recommended this to the customer, but getting anything (even free and stamped with Microsoft) at this company is treated like a big NO due to fear of security issues, costs, and the time to learn something new. Hint: I can't even use Notepad++, nor do I have admin rights to install it myself. – Fiddle Freak Jan 27 '16 at 23:29
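For later readers, a rough sketch of that splitting approach (the paths, `partN.csv` naming, and 50,000-row chunk size are illustrative, and it assumes no quoted field contains embedded newlines):

```powershell
# Sketch: split a large CSV into parts of at most $chunkSize data rows each,
# repeating the header line in every part. Paths and chunk size are illustrative.
$source    = "C:\temp\bigfile.csv"
$outFolder = "C:\temp\parts"
$chunkSize = 50000

New-Item -ItemType Directory -Path $outFolder -Force | Out-Null

$reader = New-Object System.IO.StreamReader -ArgumentList $source
$writer = $null
try {
    $header = $reader.ReadLine()
    $part   = 0
    $count  = 0

    while (($line = $reader.ReadLine()) -ne $null) {
        if ($count -eq 0) {
            # Start a new part file and repeat the header.
            if ($writer) { $writer.Close() }
            $part++
            $partPath = Join-Path $outFolder ("part{0}.csv" -f $part)
            $writer   = New-Object System.IO.StreamWriter -ArgumentList $partPath
            $writer.WriteLine($header)
        }
        $writer.WriteLine($line)
        $count++
        if ($count -ge $chunkSize) { $count = 0 }
    }
}
finally {
    if ($writer) { $writer.Close() }
    $reader.Close()
}
```

Each `partN.csv` can then be converted individually with the same COM-based code as in the question.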