I'm trying to find a way to run a batch script on Windows that backs up my project directory to our local network file share server.

Example of what I would usually run:

robocopy /mir "C:\PROJECT_FOLDER_PATH" "\\NETWORK_FOLDER_PATH"

But, every now and then, my IT admin approaches me about a massive copy operation that is slowing down the network.

As my projects folder grows over time, this becomes more of an annoyance. I try to run the script only as I'm signing off later in the day, to minimize the number of people affected in the office, but I'd like to come up with a better solution.

I've written a script that uses 7-Zip to create an archive split into 250 MB volumes. So now I have a folder that contains just several smaller files and no subfolders to worry about. But if I batch-copy all of these to the server, I'm concerned I'll still run into the same problem.
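
For reference, the 7-Zip step is something like this (the staging path and archive name are placeholders, and 7z.exe is assumed to be on the PATH; -v250m splits the archive into 250 MB volumes):

7z a -v250m "C:\BACKUP_STAGING\project.7z" "C:\PROJECT_FOLDER_PATH"

This produces project.7z.001, project.7z.002, and so on, each at most 250 MB.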

So my initial idea was to copy one file at a time, every 5-10 seconds, rather than all at once. But I would only want the script to run once. I know I could write a loop and rely on robocopy's /mir switch to skip files that have already been backed up, but I don't want to have to monitor the script once I start it.

I want to run the script when I'm ready to do a backup and then have it copy the files up to the network share at intervals, to avoid overtaxing our small network.
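
A rough sketch of what I have in mind, assuming the volumes sit in a staging folder (the folder names, the share path, and the 10-second delay are all placeholders):

@echo off
rem Sketch: push the 7-Zip volumes to the share one file at a time,
rem pausing between files so the copy never saturates the network.
pushd "C:\PROJECT_FOLDER_PATH\ARCHIVE"
for %%F in (*.7z.*) do (
    copy /y "%%F" "\\NETWORK_FOLDER_PATH" >nul
    timeout /t 10 /nobreak >nul
)
popd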

KCCLEMO
  • Possible duplicate of [How do I create a batch file timer to execute / call another batch throughout the day](https://stackoverflow.com/questions/299392/how-do-i-create-a-batch-file-timer-to-execute-call-another-batch-throughout-th) – Leonardo Alves Machado Aug 03 '17 at 13:32
  • Most likely you want to use `version control` instead of copying project folders all over the place. Really. Really really. – Alejandro Aug 03 '17 at 13:33
  • Why not use version control such as git? – buncis Aug 03 '17 at 13:39
  • Version control would be overkill in this case. These are projects that I alone work on, and after a project is finished, the files are deleted. I'm not talking about source code; I do use version control for all of my web dev projects at work. But in this case I'm referring to projects outside of dev that I also work on: video editing, image editing, material files for the architectural software I support, etc. Many of us copy/back up to our own personal employee folders on the network, but doing so in large quantities like this seems to slow things to a crawl. – KCCLEMO Aug 04 '17 at 14:14

1 Answer

Robocopy has a special option to throttle data traffic while copying.

/ipg:n - Specifies the inter-packet gap to free bandwidth on slow lines.

The number n is the number of milliseconds Robocopy waits after each 64 KB block. The higher the number, the slower the copy, but also the less likely you are to run into a conflict with your IT admin.

Example:

robocopy /mir /ipg:50 "C:\PROJECT_FOLDER_PATH" "\\NETWORK_FOLDER_PATH"

For a 1 GB file (about 16,000 blocks of 64 KB), this will increase the time it takes to copy the file by 16,000 × 50 ms = 800 seconds.

Suppose it normally takes 80 seconds to copy this file; this might well be the case on a 100 Mbit connection. Then the total time becomes 80 + 800 = 880 seconds (almost 15 minutes). The bandwidth used is 8000 Mbit / 880 sec = 9.1 Mbit/s. This leaves more than 90 Mbit/s of bandwidth for other processes to use.

Other options you may find useful:

/rh:hhmm-hhmm - Specifies run times when new copies may be started.
/pf - Checks run times on a per-file (not per-pass) basis.
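
For example, combining these with /ipg so that new file copies may only start between 6 PM and 6 AM (the time window is just an illustration; adjust it to your office hours):

robocopy /mir /ipg:50 /rh:1800-0600 /pf "C:\PROJECT_FOLDER_PATH" "\\NETWORK_FOLDER_PATH"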

Ruud Helderman