
I'm creating a VB.Net application that stores data in a SQLite backend. The data comes in Excel workbooks that the user can import from. Each workbook has one worksheet (about 30,000 rows) that gets reformatted a bit and imported to a new table. What's the most efficient way to do this?

I'm currently reading the entire range from Excel into a 2D array, looping over the rows in this array, and appending each row to a long SQL statement that gets executed every thousand rows. But this is painfully slow, both in the loop over the array and in the push to SQLite. I can't help but think there must be a more efficient way of doing this.

Thanks,

Code below:

    'First open the xls reformatter book and read in our data
    Dim xlApp As New Excel.Application
    Dim xlWorkBook As Excel.Workbook
    Dim xlWorkSheet As Excel.Worksheet

    xlWorkBook = xlApp.Workbooks.Open(strFile)
    xlWorkSheet = xlWorkBook.Worksheets("ToDSS")

    Dim r As Excel.Range = xlWorkSheet.UsedRange
    Dim array(,) As Object = r.Value(Excel.XlRangeValueDataType.xlRangeValueDefault)

    xlWorkBook.Close()
    xlApp.Quit()

    releaseObject(xlApp)
    releaseObject(xlWorkBook)
    releaseObject(xlWorkSheet)

    SQLconnect.Open()
    SQLcommand = SQLconnect.CreateCommand

    'now loop through the rows inserting each into the db

    Dim curDate As Date

    strSQL = ""
    Dim batch As Integer = 0

    For row As Integer = 16 To array.GetUpperBound(0)
        strSQL += "INSERT INTO scenario_" & strScenarioName & " VALUES ('"
        curDate = array(row, 1)
        strSQL += curDate.ToString("yyyy'-'MM'-'dd") + "'"

        For col = 2 To 30
            strSQL += ", " & array(row, col)

        Next
        strSQL += " );" & vbCrLf

        If batch > 1000 Or row = array.GetUpperBound(0) Then
            Debug.Print(Str(row))
            SQLcommand.CommandText = strSQL
            SQLcommand.ExecuteNonQuery()
            Debug.Print("pushed")
            strSQL = ""
            batch = 0
        Else
            batch += 1
        End If

    Next
    SQLcommand.Dispose()
    SQLconnect.Close()
Colin Talbert

2 Answers


Export to CSV format and use a bulk load.
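For SQLite, the usual "bulk load" is the command-line shell's `.import` command. A minimal sketch of that route, assuming the worksheet has been saved as a headerless CSV (the `scenario.csv`/`scenario.db`/`scenario_test` names and the three-column layout are made up for illustration):

```shell
# Stand-in for the worksheet export: a tiny headerless CSV
# (date column first, matching the asker's layout).
printf '2012-01-01,1.5,2.5\n2012-02-01,3.0,4.0\n' > scenario.csv

# Create the target table, then let .import bulk-load the file.
# .mode csv tells the shell how to parse the input.
sqlite3 scenario.db <<'EOF'
CREATE TABLE IF NOT EXISTS scenario_test (run_date TEXT, v1 REAL, v2 REAL);
.mode csv
.import scenario.csv scenario_test
EOF
```

Note that `.import` into an already-existing table loads every line of the file, so the CSV should be exported without a header row (or the header deleted first). From VB.Net you could shell out to `sqlite3` with `Process.Start` after writing the CSV.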

Chriseyre2000
  • This might work. I'm not familiar with bulk load and can't find any references to it relative to SQLite and VB.Net. Thanks – Colin Talbert Feb 10 '12 at 23:17

Perhaps slightly roundabout, but you could import your file into Access instead of Excel and then export to SQLite.

There's an ODBC driver for SQLite:

http://www.ch-werner.de/sqliteodbc/

Using that you can export from Access to SQLite. Here's an example of doing it from the GUI. My base assumption is that this is programmable or can be automated (perhaps from VBA) as well.

http://support.microsoft.com/kb/200427

I assume it will give you minimal formatting options, but it should be a rather clean (and immediate) way of doing it. The driver is open source, so you could extend it if you are proficient enough in C.

Max
  • This is something the end user will have to do, as the data is natively in Excel format. Going through Access first would be a bit of a pain. – Colin Talbert Feb 10 '12 at 23:14
  • Unfortunately I haven't been able to find an easier way than this myself. The only thing I haven't done yet is try automating it. Where I work we have CAD modelers that do a lot of data dumps to Access, since that's what the industry knows, apparently. The data ends up going to Oracle in a roundabout fashion. It's a pretty big mess all around. – Max Feb 13 '12 at 19:21
  • I have coded a .Net web app to import Excel spreadsheets to a SQL database, then the code interacts with a SQLite database to push the data back and forth, but there are major issues with carriage returns in the Excel cells. Web services strip out CRLF in exchange for just LF. Excel likes porting each LF to a new cell, so it was a bit tricky to get it to work, especially if the cell requires an LF on the end of the record. – htm11h Jan 04 '18 at 16:16