
The loop through all subfolders stops at unforeseeable positions. VBA throws no error message. The file being processed at the moment the code halts differs from run to run.

The files to be opened are raw, non-ASCII files. Excel can convert them into CSV files for further batch processing. I tried both of the following approaches with the same result:

1. Loop Through All Subfolders Using VBA
2. Cycle through sub-folders and files in a user-specified root directory

Sub LoopFolders()
    Dim FileSystem As Object
    Dim HostFolder As String

    HostFolder = "c:\abc\Donnees_de_production\"

    Set FileSystem = CreateObject("Scripting.FileSystemObject")
    Application.ScreenUpdating = False          ' Deactivate ScreenUpdating while processing
    DoFolder FileSystem.GetFolder(HostFolder)
    Application.ScreenUpdating = True
End Sub


Sub DoFolder(Folder)
    Dim SubFolder As Object
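    ' Recurse into subfolders first, then process this folder's files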
    For Each SubFolder In Folder.SubFolders
        DoFolder SubFolder
    Next
    Dim File As Object
    For Each File In Folder.Files
        ' Operate on each file
        Workbooks.Open FileName:=File.Path
        ' MsgBox File.Path & ".csv"
        ActiveWorkbook.SaveAs FileName:=File.Path & ".csv", FileFormat:=xlCSV
        ActiveWorkbook.Close
    Next
End Sub

I have about 6,500 files to convert to CSV. The execution stops after around 1,200, and I have no idea why.

I would highly appreciate any help or ideas! Thank you.

L. Stein
  • Do 1000, stop, reset, do the next 1000... – Solar Mike Jan 08 '19 at 10:18
  • I may be overlooking something here, but you don't seem to be running any file-type checks to prevent yourself from re-processing the CSV files in your loop? Have you tried keeping an eye on your memory usage as the script runs? – Chronocidal Jan 08 '19 at 10:19
  • I would replace the code that loops on Files with a Debug.Print(File.Path) and see if it runs fine without opening and changing the workbooks. – A9S6 Jan 08 '19 at 10:20
  • You could Debug.Print each SubFolder and File to see where it ends. Also, the subfolder loop seems separate from the file loop. Shouldn't the file loop be within the subfolder loop? – Jason Stallard Jan 08 '19 at 10:21
  • @Chronocidal I think that shouldn't be a problem, as the file list is queried before the creation of the CSV files starts. – L. Stein Jan 08 '19 at 10:25
  • @Chronocidal Memory stays stable. So far no CSV files have been double-processed. – L. Stein Jan 08 '19 at 10:33
  • @A9S6 Thank you. The file at which the processing ends differs each time. Moreover, when applying the procedure to convert the first 1000 and then the next 1000, the files it got stuck on before are processed without a problem in the second round. Weird side effect: with the Debug.Print(File.Path) in the loop, only about 30 files are treated before execution halts. – L. Stein Jan 08 '19 at 10:38
  • `With the Debug.Print(File.Path) in the loop, only about 30 files are treated before execution halts` That's weird. To find the code that is causing the issue, I think you should keep removing code (e.g. the Files loop) until it passes for all files and start from there. – A9S6 Jan 08 '19 at 10:45
  • No, the file list is ***not*** queried before the creation of the CSV files. `For Each File` is calling the `Folder.Files` iterator. You can verify this by following JasonStallard's suggestion and replacing the file operation with a `Debug.Print`. – Comintern Jan 08 '19 at 14:08
  • Does it work correctly if you comment out the "recursiveness"? So that (for test purposes) it runs through only one folder? I see two potential issues with "orphaned objects" accumulating during the recursive execution: 1) The object `File` is never being released. Try putting `Set File = Nothing` before `End Sub`. 2) Assign a `Workbook` object to the workbook being opened and use that instead of `ActiveWorkbook`, then also set that to Nothing at the end of the procedure (see the sketch below). – Cindy Meister Jan 08 '19 at 14:24
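
Combining the suggestions from the comments above, here is a minimal, untested sketch of `DoFolder`. It snapshots the file paths into a `Collection` before converting anything, so the live `Folder.Files` iterator cannot feed newly created files back into the loop; it skips names already ending in `.csv` (an assumption about how the raw files are named); and it uses an explicit `Workbook` variable that is released on every pass instead of `ActiveWorkbook`.

Sub DoFolder(Folder As Object)
    Dim SubFolder As Object
    Dim File As Object
    Dim wb As Workbook
    Dim Paths As Collection
    Dim p As Variant

    ' Recurse into subfolders first
    For Each SubFolder In Folder.SubFolders
        DoFolder SubFolder
    Next SubFolder

    ' Snapshot the file paths before converting anything, so the
    ' .csv files created below are never seen by this loop
    Set Paths = New Collection
    For Each File In Folder.Files
        If LCase$(Right$(File.Name, 4)) <> ".csv" Then   ' skip already-converted files (assumption)
            Paths.Add File.Path
        End If
    Next File
    Set File = Nothing

    ' Convert from the snapshot, using an explicit Workbook object
    For Each p In Paths
        Set wb = Workbooks.Open(FileName:=p)
        wb.SaveAs FileName:=p & ".csv", FileFormat:=xlCSV
        wb.Close SaveChanges:=False
        Set wb = Nothing                                 ' release the reference on every pass
    Next p
End Sub

If SaveAs complains about overwriting .csv files that already exist from an earlier run, setting Application.DisplayAlerts = False before the call to DoFolder in LoopFolders (and back to True afterwards) suppresses those dialogs.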

0 Answers