
I feel kind of lazy asking this one, but I can't seem to come up with the right Google query to find answers to my question.

A bit of background: I have an app that monitors other processes for unhandled exceptions, crashes, and so on. When triggered, this app gathers system info and creates a memory.dmp file using MiniDumpWriteDump.

We'd now like this process monitor app to upload the crash data to a server, but the memory.dmp files can obviously be massive, which makes them undesirable for upload. So it seems we can either reduce the size of the memory.dmp when we create it (potentially making it useless if we leave out that vital bit of info) or end up having to upload massive files.

Is there any way, after we have created the memory.dmp, that it can be opened, some initial analysis done (I know this bit is possible), and any bits of the memory.dmp deemed not useful removed/edited out (with a smaller copy of the memory.dmp uploaded instead)?

By "bits" of the memory.dmp I mean, for instance, removing the handle data or information about unloaded modules. See the MINIDUMP_TYPE enumeration.
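For context, here is a minimal sketch of the kind of dump-writing call involved. The function name, file name, and the particular flag combination are illustrative, not our actual code; the point is that the size/usefulness trade-off is decided up front via the MINIDUMP_TYPE flags passed to MiniDumpWriteDump:

```cpp
// Sketch only: write a reduced dump by choosing leaner MINIDUMP_TYPE flags.
// hProcess, processId, and exceptionInfo are assumed to come from the
// monitoring app. Link against dbghelp.lib.
#include <windows.h>
#include <dbghelp.h>

bool WriteSmallDump(HANDLE hProcess, DWORD processId,
                    MINIDUMP_EXCEPTION_INFORMATION* exceptionInfo)
{
    HANDLE hFile = CreateFileW(L"memory.dmp", GENERIC_WRITE, 0, nullptr,
                               CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (hFile == INVALID_HANDLE_VALUE)
        return false;

    // Include handle data plus only the heap memory reachable from thread
    // stacks, instead of MiniDumpWithFullMemory. This is the kind of
    // "vital bit of info" trade-off described above.
    const MINIDUMP_TYPE dumpType = static_cast<MINIDUMP_TYPE>(
        MiniDumpWithHandleData |
        MiniDumpWithIndirectlyReferencedMemory |
        MiniDumpScanMemory);

    BOOL ok = MiniDumpWriteDump(hProcess, processId, hFile, dumpType,
                                exceptionInfo, nullptr, nullptr);
    CloseHandle(hFile);
    return ok != FALSE;
}
```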

rb_
  • Why not zip it? The compression ratio is about 25% on the dump I just tried *(from 100MB to 25MB)*. I doubt that, even with all the clever removal schemes you can come up with, you'll be able to remove 75% of the file and have it still be useful. – Lieven Keersmaekers Jul 01 '15 at 10:33
  • Yup, it's already being zipped up with the other debug data being gathered. To be fair, what can be compressed, like what can be removed, will vary massively depending on the state of the application when it crashes (memory that is mostly initialised to 0 is easy to compress). I'm asking mainly to find out if it's even possible to refactor a memory dump file. – rb_ Jul 01 '15 at 10:46

1 Answer


Go with Lieven Keersmaekers's advice first. Really, you want to preserve as much of the data in the dump as possible for later analysis, so if zipping the dump is enough, do that first.

To more directly answer the question...

If compression isn't enough, there is a little-known trick for shrinking a dump file whose only mention is tucked away in the documentation here.

Shrinking an Existing Dump File

CDB and WinDbg can also be used to shrink a dump file. To do this, begin debugging an existing dump file, and then use the .dump command to create a dump file of smaller size.

So, if you had a dump file taken with .dump /ma, then you could shrink it by opening that dump file and using .dump /mhi. Choose whichever minidump options give you the best usability vs. size trade-off. The i option is a good choice to bring in only the heap memory referenced by the stack.
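As a concrete sketch of that round trip (the file paths are placeholders), the shrink can be done in a short CDB session:

```
> cdb -z C:\dumps\full.dmp
0:000> .dump /mhi C:\dumps\shrunk.dmp
0:000> q
```

The `/m` options on the second `.dump` control which streams make it into the new file, so this is where you pick your usability vs. size trade-off.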

Caveat: Your mileage may vary with this technique. With 32-bit dumps, I've had this trick work without fail. 64-bit dumps have acted a bit goofy for me and completely ignored the minidump options I've passed in.

Sean Cline
  • Brilliant, precisely the sort of thing I was looking for. Thanks so much for digging out that little gem for me, Sean :) – rb_ Jul 01 '15 at 11:09
  • Regarding the caveat: is it specific to a combination of dump bitness and WinDbg bitness? – Thomas Weller Jul 01 '15 at 13:03
  • @Thomas When I saw that behaviour, it was with a 64-bit dump of a 32-bit process. Perhaps 64-bit dumps of 64-bit processes work better. – Sean Cline Jul 01 '15 at 14:01