
I need to copy a lot of files from many file systems into one big storage. I also need to limit the bandwidth of the file transfer, because the network is not stable and I need the bandwidth for other things. Another requirement is that it be done in C#.

I thought about using the Microsoft Sync Framework, but I don't think it provides bandwidth limiting. I also thought about robocopy, but it is an external process, so handling errors might be a bit of a problem. I looked at BITS, but there is a problem with the scalability of its jobs: I will need to transfer more than 100 files, and that means 100 jobs at the same time.

Any suggestions or recommendations?

Thank you

Lee

2 Answers


I'd take a look at How to improve the Performance of FtpWebRequest? It might not be exactly what you're looking for, but it should give you some ideas.

I think you'll want some sort of limited tunnel, so the negotiating processes can't claim more bandwidth because there simply is none available to them. A connection within a connection.

Alternatively, you could make a job queue, which holds off on sending all the files at the same time and instead sends n files at a time, waiting until one is done before starting the next.
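A minimal sketch of that queue idea, using `SemaphoreSlim` to cap the number of concurrent copies (the `CopyAllAsync` helper, the path handling, and the concurrency limit are all assumptions for illustration, not part of any of the libraries you mentioned):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ThrottledCopier
{
    // Hypothetical helper: start a copy task per source file, but let at
    // most maxConcurrent of them actually run at any one time.
    public static async Task CopyAllAsync(
        IEnumerable<string> sources, string destinationDir, int maxConcurrent)
    {
        using var gate = new SemaphoreSlim(maxConcurrent);

        var tasks = sources.Select(async src =>
        {
            await gate.WaitAsync();   // wait until a slot is free
            try
            {
                var dest = Path.Combine(destinationDir, Path.GetFileName(src));
                using var input = File.OpenRead(src);
                using var output = File.Create(dest);
                await input.CopyToAsync(output);
            }
            finally
            {
                gate.Release();       // free the slot for the next file
            }
        });

        await Task.WhenAll(tasks);
    }
}
```

With `maxConcurrent = 1` this degenerates into a strict one-at-a-time queue; a small value like 2-4 usually keeps the link busy without saturating it.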

G_V

Well, you could just use the usual I/O methods (read + write) and throttle the rate.

A simple example (not exactly great, but working), would be something like this:

var buffer = new byte[64 * 1024];
int bytesRead;

while ((bytesRead = await fsInput.ReadAsync(buffer, 0, buffer.Length)) > 0)
{
    await fsOutput.WriteAsync(buffer, 0, bytesRead);

    // Crude throttle: roughly 64 KiB per 100 ms, i.e. about 640 KiB/s.
    await Task.Delay(100);
}
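If you want to target an actual rate rather than a fixed delay per chunk, you can size the delay to the bytes actually transferred. A sketch of that variant (the `RateLimitedCopy` class and the `targetBytesPerSecond` parameter are my own names, not from any framework):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading.Tasks;

static class RateLimitedCopy
{
    // Copy input to output, sleeping whenever we get ahead of the target rate.
    public static async Task CopyAsync(
        Stream input, Stream output, long targetBytesPerSecond)
    {
        var buffer = new byte[64 * 1024];
        var watch = Stopwatch.StartNew();
        long totalBytes = 0;
        int bytesRead;

        while ((bytesRead = await input.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            await output.WriteAsync(buffer, 0, bytesRead);
            totalBytes += bytesRead;

            // At the target rate, totalBytes should have taken this many ms.
            var expectedMs = totalBytes * 1000 / targetBytesPerSecond;
            var aheadMs = expectedMs - watch.ElapsedMilliseconds;
            if (aheadMs > 0)
                await Task.Delay((int)aheadMs);
        }
    }
}
```

This self-corrects over the whole transfer: slow chunks earn back time, fast chunks get delayed, so the average rate converges on the target.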

Now, obviously, bandwidth throttling isn't really the job of the application. Unless you're on a very simple network, this should be handled by the QoS of the router, which should ensure that the various services get their share of the bandwidth - a steady stream of data will usually have a lower QoS priority. Of course, it does usually require you to have a network administrator.

Luaan