
I am writing a batch script that walks every file in a directory tree, finds code files, and modifies them in some way. After I finished that task I tried to run it on a large directory with about 6,000 files. About 40 minutes in, the script crashed and I got lots of out-of-memory errors from the command prompt; running the script while watching Task Manager showed that my program was eating memory at a rate of about 1 MB per loop iteration. Naturally thinking I had done something wrong, I cut out all the code I had written to isolate the problem. But then I was left with a file empty except for a for loop, and the problem still persisted!

Here is what I ran on a fairly large directory like I said:

@echo off
setlocal ENABLEEXTENSIONS ENABLEDELAYEDEXPANSION
set directory=%CD%
for /R "%directory%" %%a in (*.c *.cpp *.h *.idl) do (
    set currentDir=%%~dpa
    pushd !currentDir!
    popd
)

I have actually been able to trim it down to:

@echo off
for /R "%CD%" %%a in (*) do echo

And the problem still persists.

Is there a memory leak in the batch for loop or am I just doing something wrong?

I am running Windows XP 32bit Service Pack 2 although I have tested and confirmed the problem is still present in Service Pack 3.

Maynza
  • What is the point of popd right after the pushd command? That does not make sense to me. You accomplish nothing with that (just going back and forth between 2 directories). – steenhulthin Jun 13 '11 at 12:51
  • I am emulating what my actual code does. In the full file there is a lot of stuff between the push and the pop. – Maynza Jun 13 '11 at 12:52
  • ok, I think you should add ... where there is code not shown. – steenhulthin Jun 13 '11 at 12:54
  • Oh, I misread that part of your question. That you took it out. – steenhulthin Jun 13 '11 at 12:56
  • I was just trying to simplify the code down to the smallest block that still has the leak. If you prefer you can add echo !currentDir! between the push and the pop. – Maynza Jun 13 '11 at 12:58
  • Is the leak still there if you remove both the pushd and popd? It looks like it on my machine. – steenhulthin Jun 13 '11 at 13:13
  • Yes it is, although it is harder to test. – Maynza Jun 13 '11 at 13:26
  • Can you add what windows version you are using in the question? It might be relevant. – steenhulthin Jun 13 '11 at 13:31
  • I ran your code with no errors. Can you show us the actual error messages you are seeing? – aphoria Jun 13 '11 at 13:45
  • There aren't errors; it runs out of memory. If you have sufficient memory and not a large directory, there is no problem. But in a directory with 6,000 files and directories, on a system with 2 GB of RAM, it runs out of memory. For a test case: create a directory about 2 layers down, fill it with a few blank .c files, then copy it over and over again; then go up a level and copy that folder over and over again; then go to the top level and copy that one over and over. Then run the script while watching cmd.exe in Task Manager. – Maynza Jun 13 '11 at 13:51
  • How do you know it's running out of memory if there are not any error messages? – aphoria Jun 13 '11 at 13:53
  • Because you can watch the memory used increase linearly and never be released. When run in my actual script, it ran for 40 minutes, then ran out of memory and gave me Windows error code 8, ERROR_NOT_ENOUGH_MEMORY (0x8): "Not enough storage is available to process this command." – Maynza Jun 13 '11 at 13:56
  • Where do you see that error message? In the command prompt window? – aphoria Jun 13 '11 at 14:37
  • Yes, or, if I redirect errors when running the command, in my error log file. – Maynza Jun 13 '11 at 14:40
  • Weird. I just ran it with no errors on an XP SP3 machine for all files starting in the root of C:. – aphoria Jun 13 '11 at 14:45
  • It could be a result of some other thing on my systems I suppose. I work in an area where the computers are really locked down. – Maynza Jun 13 '11 at 14:48
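The stress-test tree Maynza describes in the comments (nested directories filled with blank .c files, built by hand-copying folders) can also be generated programmatically. The following is a hypothetical sketch in Python; the root name, branching factor, and file count are arbitrary assumptions, not values from the thread:

```python
import os

# Build a two-level tree of directories, each leaf holding a few
# blank .c files, to stress-test recursive directory enumeration.
# All names and counts here are arbitrary (hypothetical example).
ROOT = "leaktest"
BRANCHING = 10   # subdirectories per level
FILES = 6        # blank .c files per leaf directory

for top in range(BRANCHING):
    for mid in range(BRANCHING):
        leaf = os.path.join(ROOT, f"dir{top}", f"dir{mid}")
        os.makedirs(leaf, exist_ok=True)
        for n in range(FILES):
            # Create an empty file, much like "copy nul blank0.c" would.
            open(os.path.join(leaf, f"blank{n}.c"), "w").close()
```

Running the batch loop over the resulting tree while watching cmd.exe in Task Manager is then a repeatable way to observe the growth.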

3 Answers


This is not so much an answer as a workaround (or rewrite). I have run your script and I see some memory consumption, some of which is not released until the script finishes, but I do not get anything near 1 MB per iteration. (Just like @aphoria, I can run the script on the root of C: without problems on XP SP3 and Vista.)

I suggest that you close any processes not necessary for the script and run your script on one subdirectory at a time.

If you cannot solve this in any other manner I suggest you try writing the script in Powershell.

steenhulthin

The problem lies in the fact that CMD.EXE and COMMAND.COM set aside a specific amount of memory for their use. Generally, when using batch files to do a lot of processing, you want to increase the amount of memory the script can use.

Depending on the OS Environment you are running, you can do the following:

WINDOWS NT/2000/XP: You can alter COMMAND.COM environment memory with the /e switch, for example:

command.com /e:2048 /c BatchFile.BAT

will run BatchFile.BAT in a shell with 2048 bytes environment memory.


WINDOWS 95/98/ME: You can change the environment memory allocated in a particular MS-DOS window or in the Shortcut to a specific Batch file. Open the Properties dialogue (right-click the Shortcut or Desktop icon, and click Properties). Click the Memory Tab. Use the initial environment pull-down box to set a size that's enough for your variables, say 2048 bytes (or more if you wish).

To be safe, I'd increase your disk memory usage as well.

Anthony Miller

It seems for /R consumes memory for its internal file list. Run from a drive root it shows the memory consumption clearly, because a drive root typically contains a great many files recursively.

This one shows much greater consumption:

@echo off

call :OUT_OF_MEMORY ^^
exit /b

:OUT_OF_MEMORY
call set __DUMMY=%%1

Use at your own risk; it can be unstoppable. :)

PS: Reproduces on Windows 7

Andry