
The process is getting slower after each iteration.

I am trying to open every file (as a temporary workbook) kept in a particular folder, compute the sum of three columns in each file, assign the values to the main sheet, and then close the temporary workbook.

Sub Compute()

Dim File_Check As Date
Dim rcell As Range
Dim temp As Workbook
Dim ws As Worksheet, ws2 As Worksheet
Dim myFilenm As String
Dim dt As Date
Dim i As Long

Set ws = ThisWorkbook.Sheets("DATA")

Application.ScreenUpdating = False

i = 1 ' first output row on the DATA sheet

For dt = #6/5/2017# To Now

' Name of the file is like data02052017.DAT
myFilenm = "D:\data" & Format(dt, "ddmmyyyy") & ".DAT"

If Dir(myFilenm) <> "" Then ' To exclude non-working days since data is not generated on those days
    ws.Range("A" & i).Value = myFilenm
    ws.Range("b" & i).Value = dt

    Set temp = Workbooks.Open(Filename:=myFilenm)
    Set ws2 = temp.ActiveSheet
    ws2.Range("A:A").Select


    ' remove spaces from every cell in column A of the temp workbook
    Selection.Replace What:=" ", Replacement:="", LookAt:=xlPart, _
    SearchOrder:=xlByRows, MatchCase:=False, SearchFormat:=False, _
    ReplaceFormat:=False


    ' text to columns, separating by comma
    Selection.TextToColumns Destination:=ActiveCell, DataType:=xlDelimited, _
    TextQualifier:=xlDoubleQuote, ConsecutiveDelimiter:=False, Tab:=False, _
    Semicolon:=False, Comma:=True, Space:=False, Other:=False, OtherChar _
    :=":", FieldInfo:=Array(Array(1, 1), Array(2, 1), Array(3, 1), Array(4, 1), Array(5, _
    1), Array(6, 1), Array(7, 1), Array(8, 1), Array(9, 1), Array(10, 1), Array(11, 1), Array(12 _
    , 1), Array(13, 1), Array(14, 1), Array(15, 1), Array(16, 1), Array(17, 1), Array(18, 1), _
    Array(19, 1), Array(20, 1), Array(21, 1), Array(22, 1), Array(23, 1), Array(24, 1), Array( _
    25, 1), Array(26, 1)), TrailingMinusNumbers:=True


    ' Some intended computations
    ws.Range("c" & i).Value = Application.WorksheetFunction.Sum(ws2.Range("P:P"))
    ws.Range("D" & i).Value = Application.WorksheetFunction.Sum(ws2.Range("Q:Q"))
    ws.Range("E" & i).Value = Application.WorksheetFunction.Sum(ws2.Range("W:W"))
    ws.Range("F" & i).Value = Format(Now, "HH:MM:SS")
    i = i + 1
    temp.Close SaveChanges:=False




End If


Next dt
Application.ScreenUpdating = True
End Sub

The 1st iteration takes about 5 seconds, whereas the 100th takes about 25 seconds, and eventually the processing time runs into minutes per iteration. I have around 500 records, each of approximately the same size, so I want to understand why the iteration time keeps increasing and what can be done to resolve the issue.

LOKE2707
  • Have you tried unbinding your variables in each loop? Something like `Set temp = Nothing` and `Set ws2 = Nothing` before binding them again in each loop – Foxfire And Burns And Burns Feb 19 '19 at 14:21
  • Yes, I have tried unbinding too. Same observations – LOKE2707 Feb 19 '19 at 14:28
  • Unbinding won't matter in this instance https://stackoverflow.com/questions/51065566/what-are-the-benefits-of-setting-objects-to-nothing – urdearboy Feb 19 '19 at 14:29
  • First step is to determine which line or lines of code are slow. Declare a `counter` variable and set it equal to 1 before the loop, then increase its value at the end of each loop. Then add an `If counter >= 100 Then` block, set a breakpoint inside it, run your code, and you can step through the 100th iteration to see which part of your code is causing the slowdown. – LetEpsilonBeLessThanZero Feb 19 '19 at 14:42
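A minimal sketch of that counter-and-breakpoint suggestion, applied to the loop above (the `counter` variable, the `Timer` readout and the 100-iteration threshold are illustrative assumptions, not part of the original code):

Dim counter As Long
Dim t As Double
counter = 1

For dt = #6/5/2017# To Now
    t = Timer                                 ' start time of this iteration
    ' ... existing per-file work goes here ...
    Debug.Print counter; Format(Timer - t, "0.00") & " s"   ' per-iteration duration in the Immediate window
    If counter >= 100 Then Stop               ' break here and step through the slow iteration
    counter = counter + 1
Next dt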

1 Answer


Office is rubbish at memory management. Or rather, it is being "clever" instead of doing what it is told.

Open a new workbook in Excel, open Task Manager (Ctrl+Shift+Esc), and observe the memory that Excel is using. Then open a saved file and close it again.

The memory Excel is using goes Up when you open the file, and Down when you close it - but not back to the level it was at before you opened the file. This is because Office is storing a "temporary" version of the file in memory, so that it can be reopened faster.

You can make this "bloat" slightly smaller by opening the file as Read Only (`Set temp = Workbooks.Open(Filename:=myFilenm, ReadOnly:=True)`), but the only way I have found to "reset" it is to completely close Excel and reopen it.

So, by file 100, your Excel application has 99 documents worth of junk data in memory slowing everything down.

(N.B. This "feature" was, so far as I can tell, introduced in Office 2007, and completely absent in Office 2003)

Chronocidal
  • Maybe @user9977436 could open separate instances of Excel? – Vincent G Feb 19 '19 at 16:23
  • Well, in Task Manager the memory usage seems to be increasing with the number of iterations. `Set temp = Workbooks.Open(Filename:=myFilenm, ReadOnly:=True)` did not work. I even tried keeping the loop in a different sub/function and calling it from the main sub, but no help. – LOKE2707 Feb 19 '19 at 17:52
  • @user9977436 You may need to break the task down (e.g. blocks of 50 files) and save the file, close Excel, then reopen the file between blocks. This is something that people have been complaining to Microsoft about for 12 years, and is still not fixed. As Vincent suggested, binding a new Excel Application with `appExcel = CreateObject("Excel.Application")` to open the workbooks in, then closing it (`appExcel.Quit`) and creating a new instance every so often (after 20-30 worksheets processed?) *might* fully dump that memory clog. – Chronocidal Feb 19 '19 at 19:27
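A hedged sketch of that separate-instance approach (the `appExcel` object, the 25-file batch size, and its placement inside the existing loop are illustrative assumptions, not tested against the original data):

Dim appExcel As Object            ' late-bound second Excel instance
Dim temp As Object
Dim filesDone As Long

Set appExcel = CreateObject("Excel.Application")

' ... inside the existing For dt loop, open the file in the second instance ...
Set temp = appExcel.Workbooks.Open(Filename:=myFilenm, ReadOnly:=True)
' ... Replace / TextToColumns / column sums as before, writing the results to ws ...
temp.Close SaveChanges:=False

filesDone = filesDone + 1
If filesDone Mod 25 = 0 Then      ' every 25 files, dump the cached workbooks with the instance
    appExcel.Quit
    Set appExcel = Nothing
    Set appExcel = CreateObject("Excel.Application")
End If

' ... after the loop ...
appExcel.Quit
Set appExcel = Nothing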