
Could someone suggest what to fix in this code?

The first version works, but it was pushing disk and RAM usage to about 90% while merging.

Here's the first version:

Private Sub BackgroundWorker1_DoWork(sender As Object, e As DoWorkEventArgs) Handles BackgroundWorker1.DoWork
    Dim counter As Double = 1
    Dim inputFolder As String = txtSource.Text
    Dim outputFolder As String = txtFilePath.Text + "\" + txtFileName.Text + ".ts"

    Using outputStream As FileStream = New FileStream(outputFolder, FileMode.Create)

        Dim files = Directory.GetFiles(inputFolder, "*.ts")

        For Each inputFiles In files

            Using inputFilestream As FileStream = New FileStream(inputFiles, FileMode.Open)

                'progress
                Dim value As Double = (counter / files.Length) * 100
                BackgroundWorker1.ReportProgress(CInt(value))

                'process file
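                'CopyTo streams the data through a fixed-size internal buffer
                '(81920 bytes by default), so no whole file is ever held in RAM here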
                inputFilestream.CopyTo(outputStream)

                counter += 1

            End Using
        Next
    End Using
End Sub

Then I tried copying the files in small chunks, with the intention of using less RAM:

Private Sub CopyMyFiles()
    Dim inputFolder As String = txtSource.Text
    Dim outputFolder As String = txtOutputFolder.Text + "\" + txtOuputFilename.Text + ".ts"

    Dim enumFiles = Directory.EnumerateFiles(inputFolder, "*.ts").ToArray

    For Each files In enumFiles

        Dim input As FileStream = New FileStream(files, FileMode.Open, FileAccess.ReadWrite, FileShare.Read, FileOptions.SequentialScan)
        Dim output As FileStream = New FileStream(outputFolder, FileMode.Create, FileAccess.Write, FileShare.ReadWrite, FileOptions.SequentialScan)

        CopyStream(input, output)

    Next

End Sub

Public Shared Sub CopyStream(inputStream As Stream, outputStream As Stream)
    Dim buffer = New Byte(1025) {}
    Dim bytesRead As Integer
    bytesRead = inputStream.Read(buffer, 0, buffer.Length)
    While bytesRead > 0
        outputStream.Write(buffer, 0, bytesRead)
        bytesRead = inputStream.Read(buffer, 0, buffer.Length)
    End While
    outputStream.Flush()
    inputStream.Close()
    outputStream.Close()
End Sub

But that isn't working: it doesn't appear to be merging the input files.

  • Calling `GetFiles`, then using a `For Each` loop, then using a separate counter does not make sense. If you're going to use a `For Each` loop then call `EnumerateFiles`. If you call `GetFiles` and have an array, use a `For` loop and then you already have a counter. That's not an answer to your question, just a general tip. – jmcilhinney May 18 '20 at 08:32
  • Alright, thanks for the info. I think I have an idea about `EnumerateFiles`. I'll see if I can fix it. – zackmark29 May 18 '20 at 08:37
  • I tried `EnumerateFiles` but I'm still getting 90% RAM usage when merging – zackmark29 May 18 '20 at 10:01
  • @zackmark29 Often you would want a program to run as quickly as possible, which should be what is happening with it using the disk and RAM. To use less RAM, you could [Copy a file in little chunks](https://stackoverflow.com/a/15561324/1115360). – Andrew Morton May 18 '20 at 12:33
  • Thank you for that source. I tried it, but I'm having a little problem: the output isn't merging the files. You can see the code above. – zackmark29 May 19 '20 at 06:23
  • @zackmark29 You need to move the output stream code outside of the loop, like you had in the first version of the code. Then you need to remove the lines `outputStream.Flush()` and `outputStream.Close()` from `CopyStream`. – Andrew Morton May 19 '20 at 11:45
  • @zackmark29 Oh, and it's `Dim buffer = New Byte(1023) {}` to allocate a 1024-byte array. You might find a 32 KB array (32767) gives somewhat better performance. – Andrew Morton May 19 '20 at 11:48
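
Putting Andrew Morton's comments together (one output stream created outside the loop, no Flush or Close of the output inside `CopyStream`, and a 32 KB buffer), a fixed version might look something like this. It is only a sketch based on those comments, not tested code; it keeps the original control names and, following jmcilhinney's tip, pairs `EnumerateFiles` with `For Each`:

Private Sub CopyMyFiles()
    Dim inputFolder As String = txtSource.Text
    Dim outputFile As String = Path.Combine(txtOutputFolder.Text, txtOuputFilename.Text & ".ts")

    'Create the output stream once, outside the loop, so each input file is
    'appended after the previous one instead of overwriting it
    Using output As New FileStream(outputFile, FileMode.Create, FileAccess.Write, FileShare.Read)
        For Each inputFile In Directory.EnumerateFiles(inputFolder, "*.ts")
            Using input As New FileStream(inputFile, FileMode.Open, FileAccess.Read, FileShare.Read)
                CopyStream(input, output)
            End Using
        Next
    End Using
End Sub

Public Shared Sub CopyStream(inputStream As Stream, outputStream As Stream)
    'New Byte(32767) {} declares the upper bound, so this is a 32768-byte (32 KB) buffer
    Dim buffer = New Byte(32767) {}
    Dim bytesRead As Integer = inputStream.Read(buffer, 0, buffer.Length)
    While bytesRead > 0
        outputStream.Write(buffer, 0, bytesRead)
        bytesRead = inputStream.Read(buffer, 0, buffer.Length)
    End While
    'No Flush or Close here: the caller owns both streams, and the output
    'stream must stay open for the remaining input files
End Sub

One more thing worth checking in the second version as posted: `FileStream` has no constructor overload that takes `FileOptions` as the fifth argument, so in `New FileStream(files, FileMode.Open, FileAccess.ReadWrite, FileShare.Read, FileOptions.SequentialScan)` the enum value appears to bind to the `Integer` bufferSize parameter instead (VB widens an enum to its underlying type), requesting a buffer of &H8000000 bytes, roughly 128 MB, per stream. That alone could account for the high RAM usage.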
