
Below is my code. It works, but it isn't perfect: it takes more than 30 minutes to loop through a text file. How can I improve the speed of looping through the file, either with different code or a different method? Please help.

Open "C:\Users\steven.EOPESTATE\Desktop\Sharp Sales\TRMSAVE01.txt" For Input As #1

Do Until EOF(1)
    Dim ITEMSQL As String
    Line Input #1, varLine
    testvarline = Split(varLine, ",")

    If testvarline(0) = "$ITEM" Then
        'Debug.Print testvarline(0), testvarline(1), testvarline(2), testvarline(3), testvarline(4), testvarline(5), testvarline(6), testvarline(7), testvarline(8), testvarline(9)
        testvarline(0) = Replace(testvarline(0), "$", " ")
        testvarline(7) = Replace(testvarline(7), ",,", " ")

        ITEMSQL = "Insert into SalesItem([ITEMID], [2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12]) Values (" & Chr(34) & testvarline(0) & Chr(34) & "," & testvarline(1) & "," & testvarline(2) & "," & testvarline(3) & "," & testvarline(4) & "," & testvarline(5) & "," & Chr(34) & testvarline(6) & Chr(34) & "," & testvarline(9) & "," & testvarline(10) & "," & testvarline(11) & "," & testvarline(12) & "," & testvarline(14) & ")"

        Debug.Print ITEMSQL
        DoCmd.RunSQL ITEMSQL
        DoCmd.SetWarnings False
        DoCmd.Echo False
    End If
Loop

Close #1
Steve
    Is looping and reading the file really the slow part? I suspect the SQL inserts are where you are spending the most time. – aphoria Jan 02 '14 at 16:47
  • Why not just import the file into a new table, then do an insert query? – Joe Jan 02 '14 at 16:56
  • I don't see any problem with reading the file; however, inserting each record into the db takes time. You can use the DoCmd.TransferText command to import. – user3123529 Jan 02 '14 at 17:11
  • You can measure the time spent in the different parts of the loop by using the MicroTimer function from [this page](http://msdn.microsoft.com/en-us/library/ff700515(v=office.14).aspx#Office2007excelPerf_MakingWorkbooksCalculateFaster). That will allow you to determine which part needs optimization: SQL or File IO. Remove the Debug.Print statement when you can, that also contributes to slowing things down. – Blackhawk Jan 02 '14 at 18:03
  • Um, how many lines are there in the file? – John Alexiou Jan 02 '14 at 20:21
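The DoCmd.TransferText suggestion from the comments can be sketched roughly as follows. This is an untested outline, not code from the post: the import-spec name "ImportSpec" and staging table "RawSales" are hypothetical, and F1, F2, … are the default field names Access assigns when a delimited file is imported without a header row.

```vba
' Sketch: bulk-import the whole file into a staging table, then run one
' set-based append query instead of thousands of single-row inserts.
' "ImportSpec" and "RawSales" are assumed names - create your own import
' specification and staging table first.
DoCmd.TransferText acImportDelim, "ImportSpec", "RawSales", _
    "C:\Users\steven.EOPESTATE\Desktop\Sharp Sales\TRMSAVE01.txt", False

' One append query for all $ITEM rows at once.
CurrentDb.Execute _
    "INSERT INTO SalesItem ([ITEMID],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12]) " & _
    "SELECT F1, F2, F3, F4, F5, F6, F7, F10, F11, F12, F13, F15 " & _
    "FROM RawSales WHERE F1 = '$ITEM'", dbFailOnError
```

Letting the database engine do the filtering and inserting in one statement usually beats any per-line VBA loop, regardless of how fast the file is read.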

1 Answer

One of the slowest things a PC can do is read from the hard drive. The CPU and RAM are orders of magnitude faster than the hard drive (see this SO question), so they end up waiting for disk reads to complete. Because of this, reading the entire file into memory at once will speed up your program significantly.

Rather than using Split again to break the whole text into lines, I recommend extracting each line yourself with the Mid function, which avoids much of the overhead of normal string processing because it doesn't create a temporary copy of the rest of the string in memory. Here is UNTESTED example code:

Dim strFileText As String
Dim lngCurrIndex As Long
Dim lngEndOfLine As Long
Dim varLine As String
Dim testvarline() As String
Dim ITEMSQL As String

Open "C:\Users\steven.EOPESTATE\Desktop\Sharp Sales\TRMSAVE01.txt" For Binary As #1

strFileText = Space(LOF(1)) 'create space for the whole file
Get #1, , strFileText       'read in the whole file at once
Close #1                    'done with the file handle

lngCurrIndex = 1

Do While lngCurrIndex < Len(strFileText)
    'find the end of this line. NOTE: this assumes the text file uses
    'the MS convention of CrLf line endings
    lngEndOfLine = InStr(lngCurrIndex, strFileText, vbCrLf, vbBinaryCompare)
    If lngEndOfLine = 0 Then lngEndOfLine = Len(strFileText) + 1 'last line may lack a CrLf

    varLine = Mid(strFileText, lngCurrIndex, lngEndOfLine - lngCurrIndex) 'get the line

    lngCurrIndex = lngEndOfLine + 2 'set lngCurrIndex to the start of the next line

    testvarline = Split(varLine, ",")

    If testvarline(0) = "$ITEM" Then
        testvarline(0) = Replace(testvarline(0), "$", " ")
        testvarline(7) = Replace(testvarline(7), ",,", " ")

        ITEMSQL = "Insert into SalesItem([ITEMID], [2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12]) Values (" & Chr(34) & testvarline(0) & Chr(34) & "," & testvarline(1) & "," & testvarline(2) & "," & testvarline(3) & "," & testvarline(4) & "," & testvarline(5) & "," & Chr(34) & testvarline(6) & Chr(34) & "," & testvarline(9) & "," & testvarline(10) & "," & testvarline(11) & "," & testvarline(12) & "," & testvarline(14) & ")"

        Debug.Print ITEMSQL
        DoCmd.RunSQL ITEMSQL
        DoCmd.SetWarnings False
        DoCmd.Echo False
    End If
Loop
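As the question's comments point out, the DoCmd.RunSQL call inside the loop is itself a likely bottleneck: each call runs a separate query with its own overhead and confirmation handling. One common alternative (my own hedged sketch, not part of the original answer) is to run the inserts through DAO inside a single transaction:

```vba
' Sketch: execute the inserts via DAO inside one transaction. One commit
' at the end is typically much faster than committing every row, and
' db.Execute never shows confirmation dialogs, so the SetWarnings/Echo
' calls become unnecessary.
Dim db As DAO.Database
Set db = CurrentDb
DBEngine.BeginTrans
'... inside the parsing loop, replace DoCmd.RunSQL ITEMSQL with:
db.Execute ITEMSQL, dbFailOnError
'... then, after the loop finishes:
DBEngine.CommitTrans
```

dbFailOnError makes a failed insert raise a trappable VBA error instead of failing silently, which also gives you a chance to call DBEngine.Rollback.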

Depending upon the size of your file, it may not be possible to load it all into memory at once. If that is the case, you can split it into large chunks and process them one at a time.
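A rough, untested sketch of that chunked variant. The 1 MB chunk size is an arbitrary choice, and note that a chunk can end in the middle of a line, so the partial tail has to be carried over into the next chunk:

```vba
' Sketch: read the file in large chunks, carrying any partial last line
' over to the next chunk. CHUNK_SIZE is arbitrary - tune it as needed.
Const CHUNK_SIZE As Long = 1048576 '1 MB
Dim strChunk As String, strCarry As String, strBlock As String
Dim lngRemaining As Long, lngCut As Long

Open "C:\Users\steven.EOPESTATE\Desktop\Sharp Sales\TRMSAVE01.txt" For Binary As #1
Do While Seek(1) <= LOF(1)
    lngRemaining = LOF(1) - Seek(1) + 1
    strChunk = Space(IIf(lngRemaining < CHUNK_SIZE, lngRemaining, CHUNK_SIZE))
    Get #1, , strChunk

    strBlock = strCarry & strChunk
    lngCut = InStrRev(strBlock, vbCrLf)      'last complete line boundary
    If lngCut = 0 Then
        strCarry = strBlock                  'no full line yet; keep accumulating
    Else
        strCarry = Mid(strBlock, lngCut + 2) 'partial tail for the next chunk
        'process Left(strBlock, lngCut - 1) line by line as in the code above
    End If
Loop
Close #1
'strCarry now holds any final line that had no trailing CrLf - process it too
```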

Blackhawk
  • I tried your code and ended up with runtime error 54, with "Get 1, , strFileText" highlighted in yellow. – Steve Jan 03 '14 at 08:53
  • @Steve Ah, sorry about that, change the Open mode from `Input` to `Binary`: `Open "C:\Users\steven.EOPESTATE\Desktop\Sharp Sales\TRMSAVE01.txt" For Binary As #1` – Blackhawk Jan 03 '14 at 15:12