Monitor a program's memory use
September 14, 2006 5:54 AM

I'm writing a program that uses lots of memory. I'd like a utility that can show me what it's using the memory for. For example, one part of the program processes an XML file, creates an object and loads a WebBrowser .NET control. I'd like to see how much memory was required to parse the XML file, how much memory my object is using, and how much memory the WebBrowser control uses.

I'd prefer an external utility to do this, rather than adding monitoring functions all over the place in my code.

I'm using C# and Visual Studio 2005 if that helps.
posted by matthewr to Computers & Internet (3 answers total)
Best answer: The magic phrase you want to search for is "profiler" or "profiling". MSDN has a list of such tools for Visual Studio. On that list I see CompuWare's tool, which is a commercial offering but a good one. In general these tools break down like this: the commercial, pricey offerings are easier to use and have some heuristics built in to bring the problem areas to the forefront, while the cheaper and free tools work but require you to be more sophisticated in how you use them and how you interpret the results you get.
posted by mmascolino at 6:39 AM on September 14, 2006

If this were C or C++, I'd suggest an instrumented operator new, or even an instrumented malloc if the XML processing is your code. You can find ready-made source for this sort of thing, and it's not too intrusive (basically, you link against the instrumented library instead of the standard one). Whether or not that's possible in C# I don't know. I don't see an easy way to instrument the .NET control.

You can also just do some rough math. The parsed XML will be a bunch of nodes and a bunch of strings (either C strings or some String class). Figure out sizeof(node), then figure out the average string size: for a C string it's probably the length rounded up to the nearest 16 bytes; for a String class it's probably sizeof(String) rounded up to the nearest 16, plus the average length rounded up to the nearest power of two. Then multiply (node size rounded to the nearest multiple of 16 + string size) * nodes for a rough idea.
posted by orthogonality at 7:58 AM on September 14, 2006

Response by poster: Thanks, mmascolino. I decided on SciTech's Memory Profiler, which seems to be working well.

Orthogonality, thanks for the comment. 'Rough math' works fine for the XML parsing stage, but it doesn't help me work out the memory usage of the various objects I populate based on the XML data - the class structure is fairly complex and it would take ages to work out any reasonable estimate.
posted by matthewr at 8:29 AM on September 14, 2006
