I am running VBA code on a large Excel spreadsheet. How do I clear the memory between procedures/calls to prevent an "out of memory" issue occurring?
-
It's likely that you need to look at the structure of your VBA, and particularly any very large arrays/strings it might be working with - break these down into batches to limit the total memory used by your code at any one time. – Jon Egerton Jan 18 '13 at 10:34
-
Release Variants and Objects whenever possible (Erase, Set object = Nothing). ReDim them to a more reasonable size, or loop over them in buffer-sized batches. But the most probable reason is that the spreadsheet itself is too large (check in Task Manager whether it's taking >500MB of RAM before running any macro). **You may want to open a read-only spreadsheet and remove all unused Sheets** (this will free a lot of memory). – Larry Jan 18 '13 at 10:37
-
You should show the code which is causing the problem. – Tim Williams Jan 18 '13 at 15:54
-
You could use 64-bit Office to get around the issue temporarily, but you still need to address the problem as mentioned by most people. – Dreamwalker Jan 24 '13 at 14:20
-
You might not even be having memory issues - I've encountered VBA reporting "Out of Memory" errors when the root cause was a function in an add-on DLL I had written raising an exception, thinking VBA would report that to the user. Apparently "Out of Memory" can also be VBA's way of saying "I don't know WTF to do about this"! – Loophole Jul 21 '14 at 03:00
-
Loophole, can you expand on how you determined that the DLL was raising the exception? Did you use the Windows Event Viewer? Seems others agree with you as well; curious how you identified the exception. – michaelf Apr 13 '20 at 23:19
7 Answers
The best way to help free memory is to nullify large objects:
Sub Whatever()
    Dim someLargeObject As SomeObject
    'expensive computation
    Set someLargeObject = Nothing
End Sub
Also note that global variables remain allocated from one call to another, so if you don't need persistence you should either not use global variables or nullify them when you don't need them any longer.
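For instance, a minimal sketch of releasing a module-level (global) object once it is no longer needed (the variable name and its contents are illustrative, not from the question):

'Module-level variable: stays allocated between calls
Dim gBigDict As Object

Sub BuildCache()
    Set gBigDict = CreateObject("Scripting.Dictionary")
    'expensive computation filling gBigDict
End Sub

Sub ReleaseCache()
    'nullify the global once persistence is no longer needed
    Set gBigDict = Nothing
End Sub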
However this won't help if:
- you need the object after the procedure (obviously)
- your object does not fit in memory
Another possibility is to switch to a 64-bit version of Excel, which should be able to use more RAM before crashing (32-bit versions are typically limited to around 1.3GB).

-
Besides holding the fewest references possible in VBA, I think it's very important to reduce the amount of data in Excel itself. Remove all unused sheets, unused cells, conditional formatting, coloring, validation, filters, etc. (I had a case where the Excel file itself was using ~1.1GB of memory without any macro.) So it's important to note that VBA and Excel itself share the 1.3GB of memory in Excel 2007 or below. – Larry Jan 18 '13 at 10:41
-
Absolutely - weird formatting in unused cells, for example, can cause a workbook to go from 30kB to several MB... – assylias Jan 18 '13 at 10:43
-
@assylias - IMHO, adding `Set obj = Nothing` at the end of a procedure (where `obj` is declared) won't free additional resources, as the object is terminated by the VBA garbage collector anyway. However, if `obj` is not needed anymore earlier in the sub, setting it to Nothing will free the memory earlier. – Peter Albert Jan 21 '13 at 09:20
-
64-bit Excel has its issues too; I run on a 16GB RAM computer and Excel has the audacity to say "out of memory" when nearly half (8GB) is still available. – Patrick Lepelletier Mar 17 '19 at 17:55
-
I'm having a similar issue: https://stackoverflow.com/questions/59617082/macro-fails-after-multiple-iterations – Gitty Jan 08 '20 at 20:53
I've found a workaround. At first it seemed it would take up more time, but it actually makes everything work smoother and faster due to less swapping and more memory being available. This is not a scientific approach, and it needs some testing before it works.
In the code, make Excel save the workbook every now and then. I had to loop through a sheet with 360,000 rows and it choked badly. After every 10,000 rows I made the code save the workbook, and now it works like a charm, even on 32-bit Excel.
If you start Task Manager at the same time you can see the memory utilization go down drastically after each save.
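A minimal sketch of the idea (the loop body and batch size are placeholders, not the exact code from this answer):

Sub ProcessLargeSheet()
    Dim i As Long
    For i = 1 To 360000
        'process row i here
        If i Mod 10000 = 0 Then
            'periodic save - memory usage drops after each one
            ThisWorkbook.Save
        End If
    Next i
End Sub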

-
It seems that Excel basically purges the memory when you save the workbook. Pretty much like SQL Server purges the transaction log when committing data. – Arne Larsson Nov 21 '15 at 15:22
-
Hey @ArneLarsson, you saved my day!!! This works like a charm!!! And it is true, Excel releases memory after saving. – Elbert Villarreal Jul 27 '17 at 22:03
-
I know this is old but wanted to point out that Excel 2016+ no longer releases resources on save... this used to be a good workaround! – Orin Moyer Jun 05 '19 at 12:51
-
Thanks for this tip. It partially worked for me. When I tried a save of the workbook from within the VBA loop itself, this had no effect on the memory leak - it was still steadily increasing. But luckily, we are launching old legacy VBA from C#, so what I did was tweak the VBA mega-loop to instead run in chunks; the C# code then runs a single chunk of VBA and saves the workbook in every iteration. The cause-effect of saving isn't so immediate when you look at logs vs. Task Manager, but this definitely prevents the memory from just steadily climbing up to 3GB and then crashing Excel. – Colm Bhandal May 19 '21 at 10:47
-
@ColmBhandal, You can run 100% from C# using the Excel Application Object Model. The commands are all the same as the VBA ones. I'm not aware of any feature that isn't reachable from the C# side. https://learn.microsoft.com/en-us/previous-versions/office/troubleshoot/office-developer/automate-excel-from-visual-c – HackSlash Jul 28 '21 at 20:00
-
@HackSlash That is correct, and in fact we build all of our new functionality in C#. However, the VBA is legacy code. If we had the resources to convert all of our VBA legacy code to C#, that's exactly what we would do. But unfortunately the VBA comprises years of tech debt that we do not have the resources to fully replace with C#. Chunking it involved very little coding effort, involving only rewriting a top-level loop. Rewriting it wholly in C# would have involved converting many VBA modules into pure C#, which would have taken a long time, and wouldn't have been worth it for our use case. – Colm Bhandal Jul 29 '21 at 10:17
The answer is that you can't explicitly clear memory, but you should be freeing memory in your routines.
Some tips, though, to help with memory:
- Make sure you set objects to Nothing before exiting your routine.
- Ensure you call Close on objects if they require it.
- Don't use global variables unless absolutely necessary.
I would recommend checking the memory usage after performing the routine again and again - you may have a memory leak.
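For example, a sketch of those tips in one routine (the workbook path is a placeholder):

Sub ProcessExternalWorkbook()
    Dim wb As Workbook
    Set wb = Workbooks.Open("C:\data\report.xlsx")  'placeholder path
    'do the work with wb here
    wb.Close SaveChanges:=False  'close objects that require it
    Set wb = Nothing             'release the reference before exiting
End Sub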

Found this thread looking for a solution to my problem. Mine required a different solution that I figured out, and that might be of use to others. My macro was deleting rows, shifting up, and copying rows to another worksheet. Memory usage was exploding to several gigs and causing "out of memory" after processing only around 4000 records. What solved it for me?
Application.ScreenUpdating = False
I added that at the beginning of my code (be sure to set it back to True at the end). I knew that would make it run faster, which it did, but I had no idea about the memory thing.
After making this small change, the memory usage didn't exceed 135 MB. Why did that work? No idea, really. But it's worth a shot and might apply to you.
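In practice it would look roughly like this (a sketch; the processing body is a placeholder, not this answer's actual code):

Sub DeleteAndCopyRows()
    Application.ScreenUpdating = False
    On Error GoTo CleanUp  'ensure updating is restored even on error
    'delete rows, shift up, and copy rows to another worksheet here
CleanUp:
    Application.ScreenUpdating = True
End Sub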

If you operate on a large dataset, it is very likely that arrays are being used. For me, creating a few arrays from a 500,000-row, 30-column worksheet caused this error. I solved it simply by using the line below to get rid of an array that was no longer necessary to me, before creating another one:
Erase vArray
Also, if only 2 columns out of 30 are used, it is a good idea to create two 1-column arrays instead of one with 30 columns. It doesn't affect speed, but there will be a difference in memory usage.
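A sketch of that pattern (sheet and column choices are illustrative):

Sub ProcessColumnsSeparately()
    Dim vArray As Variant
    'load only one needed column at a time instead of all 30
    vArray = Sheet1.Range("A1:A500000").Value
    'process the first column here
    Erase vArray  'free the array before loading the next one
    vArray = Sheet1.Range("D1:D500000").Value
    'process the second column here
    Erase vArray
End Sub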

I had a similar problem that I resolved myself... I think it was partially my code hogging too much memory while there were too many "big things" in my application - the workbook goes out and grabs another department's "daily report", and I extract all the information our team needs (to minimize mistakes and data entry).
I pull in their sheets directly... but I hate the fact that they use merged cells... which I get rid of (i.e. unmerge, then find the resulting blank cells, and fill them with the values from above).
I made my problem go away by:
a) unmerging only the "used cells" - rather than merely attempting to do the entire column... i.e. finding the last used row in the column, and unmerging only that range (there are literally thousands of rows on each of the sheets I grab), and
b) knowing that Undo only keeps roughly the last 16 events, between each "unmerge" I put 15 events which clear out what is stored in the Undo stack, to minimize the amount of memory held up (i.e. go to some cell with data in it and copy // paste special value)... I was GUESSING that the accumulated sum of 30 sheets, each with 3 columns' worth of data, might be taxing the memory set aside for undoing.
Yes, it doesn't allow for any chance of an Undo... but the entire purpose is to purge the old information and pull in the new time-sensitive data for analysis, so it wasn't an issue.
Sounds corny - but my problem went away.
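A sketch of step a) - unmerging only the used part of a column and filling the blanks from above (the sheet and column references are illustrative, not from this answer):

Sub UnmergeUsedCells(ws As Worksheet)
    Dim lastRow As Long
    'find the last used row instead of touching the entire column
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
    With ws.Range("A1:A" & lastRow)
        .UnMerge
        On Error Resume Next  'SpecialCells errors if there are no blanks
        'fill each blank left by unmerging with the value from above
        .SpecialCells(xlCellTypeBlanks).FormulaR1C1 = "=R[-1]C"
        On Error GoTo 0
        .Value = .Value  'convert the formulas back to static values
    End With
End Sub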

-
This seems odd... doesn't Undo only store user-entered commands? So on running your macro, wouldn't it be cleared anyway? – Grade 'Eh' Bacon Nov 20 '15 at 14:55
I was able to fix this error by simply initializing a variable that was being used later in my program. At the time, I wasn't using Option Explicit in my class/module.
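A minimal illustration (the names are hypothetical) - with Option Explicit, the compiler flags any undeclared variable immediately instead of letting it fail at run time:

Option Explicit  'forces every variable to be declared

Sub RunReport()
    Dim rowCount As Long
    rowCount = 0  'explicitly declare and initialize before later use
    'rest of the procedure can now use rowCount safely
End Sub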
