I'm working on a database application that uses Excel as its interface. Essentially, it's an Excel macro that connects to a database through a .NET library.
The main problem is that the clients need to be able to manage huge amounts of data (5,000-20,000 rows and about 160 columns). I've managed to minimize the delays both on the database access and on the Excel side, by loading the whole chunk of data as strings directly into the spreadsheet range's .Value property (as a string[,]). It was fast, easy, and efficient.
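For reference, the loading code is conceptually something like this (a simplified C# sketch; LoadAsStrings and queryResult are made-up names, and I'm assuming the standard Excel interop types):

using Excel = Microsoft.Office.Interop.Excel;

// Simplified sketch of the current bulk load (names are illustrative):
// everything goes into the sheet as text.
static void LoadAsStrings(Excel.Worksheet ws, string[][] queryResult)
{
    int rows = queryResult.Length;
    int cols = queryResult[0].Length;

    // Build a rectangular string[,] so the whole block crosses the COM
    // boundary in a single .Value assignment, which is what keeps it fast.
    var block = new string[rows, cols];
    for (int r = 0; r < rows; r++)
        for (int c = 0; c < cols; c++)
            block[r, c] = queryResult[r][c];

    ws.Range[ws.Cells[1, 1], ws.Cells[rows, cols]].Value = block;
}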
But now the client needs support for Excel formats. As in, I need Excel to recognize dates as dates, numbers as numbers, and so on, while supporting non-uniform data types: a date column will mostly contain dates, but it might also include text or something else.
I've tried several approaches:

- Keep loading the strings as-is, pre-formatting each column according to its expected data type (ws.Range(range.EntireColumn.Address).NumberFormat = "0", for example). It didn't work.
- Load the whole chunk as usual, and then force Excel to re-parse the columns with non-text data types by reassigning the whole column:
With ws.Range(ws.Cells(4, 11), ws.Cells(ws.UsedRange.Rows.Count, 11))
    ' Apply the target format first...
    .NumberFormat = "dd/mm/yyyy"
    ' ...then reassign the values so Excel re-parses the strings as dates
    .Value = .Value
End With
But this one gets quite slow when I need to reformat ~30 columns in a document with ~7,000 rows.
- Finally, I tried modifying the function so that it builds an object[,] array with each element in its proper data type (DateTime, int, and so on); a sketch follows below. Again, Excel is horribly slow at unpacking generic objects.
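Simplified, that third attempt looks roughly like this (columnTypes is an illustrative per-column type map, not the real code):

using System;
using Excel = Microsoft.Office.Interop.Excel;

// Sketch of the typed-object[,] attempt; the type detection is simplified.
static void LoadAsTypedObjects(Excel.Worksheet ws, string[][] queryResult,
                               Type[] columnTypes)
{
    int rows = queryResult.Length;
    int cols = queryResult[0].Length;
    var block = new object[rows, cols];

    for (int r = 0; r < rows; r++)
    {
        for (int c = 0; c < cols; c++)
        {
            string raw = queryResult[r][c];
            DateTime dt;
            int n;
            if (columnTypes[c] == typeof(DateTime) && DateTime.TryParse(raw, out dt))
                block[r, c] = dt;  // boxed DateTime -> Excel date
            else if (columnTypes[c] == typeof(int) && int.TryParse(raw, out n))
                block[r, c] = n;   // boxed int -> Excel number
            else
                block[r, c] = raw; // non-uniform column: keep the text
        }
    }

    // Still one COM call, but Excel now has to inspect every boxed element,
    // and that unpacking is where it gets horribly slow.
    ws.Range[ws.Cells[1, 1], ws.Cells[rows, cols]].Value = block;
}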
My only guess is that I could load the data in chunks by data type, but that would mess horribly with my current methods.
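For illustration, that idea would look something like this for a date column (untested and hypothetical; whether it is actually faster is exactly my question):

using System;
using Excel = Microsoft.Office.Interop.Excel;

// Hypothetical sketch of the per-type idea: write each column (or group of
// same-typed columns) as its own uniformly-typed block instead of one big
// mixed object[,].
static void LoadDateColumn(Excel.Worksheet ws, int col, int firstRow,
                           DateTime[] dates)
{
    var block = new object[dates.Length, 1];
    for (int r = 0; r < dates.Length; r++)
        block[r, 0] = dates[r];

    Excel.Range target = ws.Range[ws.Cells[firstRow, col],
                                  ws.Cells[firstRow + dates.Length - 1, col]];
    target.NumberFormat = "dd/mm/yyyy"; // format once, before the write
    target.Value = block;               // one COM call per column chunk
}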
Is there any alternative that would either improve the speed or cut down on the transformations?
Thanks in advance.
EDIT:
To clarify: the Excel file acts as a heavy client that lets several users (each with their own copy of Excel) read and update a remote database. The biggest problem was actually dealing with table locks for simultaneous writes and the like. The problem now is that the amount of data has grown so large that I need to do something to speed up import and export times. I've already made export times bearable (most of the time users don't upload that much data to the database), but import times are slow, and the main bottleneck is writing to the spreadsheet.
Furthermore, the users are already accustomed to a similar tool they were using from another company before I came in, so they are very adamant about keeping the same format. That app sends a request to a web service, downloads a CSV with the info, and then loads that CSV into Excel as a data source. That approach is not valid for me because I don't have administrative access to the machine where the DB lives, nor to any other machine I could use as a bridge for the web service; I only have administrative privileges on the database itself. I guess I could build the CSV from the queried data in the DLL, and then use the same approach as they did, though.
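Something like this, I suppose (a rough, untested sketch; there's no CSV quoting or escaping here, and the temp-file path is just for illustration):

using System;
using System.IO;
using Excel = Microsoft.Office.Interop.Excel;

// Rough sketch of the CSV route: dump the query result to a temp file and
// let Excel's own text-import machinery assign the types during the load.
static void LoadViaCsv(Excel.Worksheet ws, string[][] queryResult)
{
    string path = Path.Combine(Path.GetTempPath(), "import.csv");
    using (var w = new StreamWriter(path))
        foreach (var row in queryResult)
            w.WriteLine(string.Join(",", row)); // naive: real code must escape

    Excel.QueryTable qt = ws.QueryTables.Add(
        "TEXT;" + path, (Excel.Range)ws.Cells[1, 1], Type.Missing);
    qt.TextFileParseType = Excel.XlTextParsingType.xlDelimited;
    qt.TextFileCommaDelimiter = true;
    qt.Refresh(false); // Excel parses dates/numbers/text as it imports
}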