
I have written code that finds all files starting with a specific name and reads data from them. There are usually 1,000 or more files in the folder. I wrote a little benchmark and realized that my code reads approximately 1 file per second, which is a lot of time. I am pretty new to VBA, and I was wondering if I took the wrong approach to this? Function code:

Function ReadDataFromWorksheet()
Dim XL As Excel.Application
Dim WBK As Excel.Workbook
Dim i As Integer

i = 1

Set XL = CreateObject("Excel.Application")

'ArraySize and the Array() of results are module-level variables (not shown here);
'"PATH TO FILE" stands for the full path of the i-th file
Do While i < (ArraySize + 1)
    Set WBK = XL.Workbooks.Open("PATH TO FILE")
    Array(i).Data1 = WBK.ActiveSheet.Range("F6").Value
    WBK.Close SaveChanges:=False
    i = i + 1
Loop

Set XL = Nothing
End Function

Sorry for my bad spelling!... and thank you in advance for the help!

1 Answer


Here is how you should use Dir:

Function ReadDataFromWorksheet() As Variant

With Application
    .EnableEvents = False 'prevent event handlers from firing while we work
    .DisplayAlerts = False
    .ScreenUpdating = False
    '.Calculation = xlCalculationManual
End With

Dim XL As Excel.Application
Dim WBK As Excel.Workbook
Dim FileName As String, _
    FolderPath As String, _
    Results()
ReDim Results(0)

'Reuse the running Excel instance if there is one, otherwise create a new one
On Error Resume Next
Set XL = GetObject(, "Excel.Application")
If Err.Number > 0 Then Set XL = CreateObject("Excel.Application")
On Error GoTo 0

FolderPath = "C:\test\"
FileName = Dir(FolderPath & "*.xlsx")

Do While FileName <> ""
    Set WBK = XL.Workbooks.Open(FolderPath & FileName)
    Results(UBound(Results)) = WBK.ActiveSheet.Range("F6").Value
    WBK.Close SaveChanges:=False
    ReDim Preserve Results(UBound(Results) + 1)
    FileName = Dir
Loop
'Drop the unused last slot (only if at least one file was read)
If UBound(Results) > 0 Then ReDim Preserve Results(UBound(Results) - 1)

Set WBK = Nothing
Set XL = Nothing

With Application
    .EnableEvents = True
    .DisplayAlerts = True
    .ScreenUpdating = True
    '.Calculation = xlCalculationAutomatic
End With

ReadDataFromWorksheet = Results
End Function
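
For completeness, here is a minimal sketch of how the function above might be called, assuming it sits in a standard module of the workbook that runs the loop (the Sub name TestReadDataFromWorksheet is just for illustration):

Sub TestReadDataFromWorksheet()
    Dim Results As Variant
    Dim i As Long

    Results = ReadDataFromWorksheet()

    'Print each collected F6 value to the Immediate window
    For i = LBound(Results) To UBound(Results)
        Debug.Print "File #" & i + 1 & ": " & Results(i)
    Next i
End Sub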
  • Thank you for the answer! But I think you misunderstood: my code works fine, and I have the Dir loop to get all the paths to my files. The problem is that with the code I posted, reading the files is slow; I read approx. 135 files per minute and there are 1k+ files in the folder... I was just wondering if there is a better/faster way to do that, or is this the only way? – Uroš Grum Nov 19 '15 at 14:47
  • @UrošGrum opening/closing files will always be slow. You can speed up some things by setting ScreenUpdating to False and turning calculation to manual before starting your loop. Otherwise, Excel isn't the best platform to quickly open/close workbooks. You might save some time by opening them in batches... but that code will be a bit more complicated. – SpaceSteak Nov 19 '15 at 15:00
  • @UrošGrum: As SpaceSteak said, opening files will always be pretty slow, even for small/light files! So I added code to deactivate most of the unnecessary things to free some memory for Excel, but that won't be incredibly more efficient! I just kept the calculation mode as a comment; I'll let you choose on that one! ;) – R3uK Nov 19 '15 at 15:13
  • Thank you both, that pretty much answers my question. I was just afraid I had taken the wrong approach to the problem, which is why I asked. I think the person I am doing this for won't really mind the wait, since he can just let it run on his computer and the files are located on a network drive. I'd rather keep the screen refresh, because that way I can add a little percentage window so he doesn't think the program has frozen. – Uroš Grum Nov 20 '15 at 07:19
  • You could use a progress bar too, even if screen updating is turned off! ;) http://stackoverflow.com/questions/5181164/progress-bar-in-vba-excel Enjoy SO! ;) (a minimal Application.StatusBar sketch follows below) – R3uK Nov 20 '15 at 07:23
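
Following up on the progress-bar comments above, here is a minimal sketch that reports progress through Application.StatusBar, which keeps updating even while ScreenUpdating is off. ShowProgressExample is a hypothetical standalone Sub, it reuses the answer's example folder path, and the counters FileCount / FilesDone stand in for whatever the real loop already tracks; the file count is taken from a first Dir pass so a percentage can be shown:

Sub ShowProgressExample()
    Dim FileCount As Long, FilesDone As Long
    Dim FileName As String

    'First pass: count the files so a percentage can be shown
    FileName = Dir("C:\test\*.xlsx")
    Do While FileName <> ""
        FileCount = FileCount + 1
        FileName = Dir
    Loop

    'Second pass: process the files and report progress in the status bar
    FileName = Dir("C:\test\*.xlsx")
    Do While FileName <> ""
        'open / read / close the workbook here, as in the answer above
        FilesDone = FilesDone + 1
        Application.StatusBar = "Reading file " & FilesDone & " of " & FileCount & _
                                " (" & Format(FilesDone / FileCount, "0%") & ")"
        FileName = Dir
    Loop

    'Hand the status bar back to Excel
    Application.StatusBar = False
End Sub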