Why is my Excel VBA code execution randomly slowing down?
May 21, 2008 12:07 PM

Sometimes my Excel VBA code executes very slowly; other times it's really fast. I can't seem to pinpoint why.

I'm familiar with application.screenupdating = false and turning off autocalculation.

This happens with a variety of my programs, but most notably with a certain one that simply iterates through an array (i,j) and prints each value (i,j) to a cell on the worksheet. Just your basic nested for loops with a .cells(i,j) = array(i,j) in the middle. (OK, so the real code is more complicated, but as I explain below, this isn't the only code that has the issue, and several coworkers who use different code experience the same thing. I'm not looking for coding help; rather, I'm trying to see if this is a common problem that others experience, or if it's really just a local company phenomenon.)
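(For the curious, the simplified pattern is basically this, with ws and myArray as stand-ins for the real names and a 1-based array assumed:

For i = 1 To UBound(myArray, 1)
    For j = 1 To UBound(myArray, 2)
        ws.Cells(i, j) = myArray(i, j)   ' write each value to its own cell
    Next j
Next i

The real version has more going on, but that's the core of it.)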

Sometimes when I run it, it can whip through a 500x100 array in just a couple of seconds; other times it slogs through, taking almost 10 minutes.

This is always the same type of data, mind you, and there are never any formulas or cell formatting to worry about. Just a blank workbook with one worksheet that needs to be filled with the data from the array.

A few other coworkers notice similar things.
Sometimes restarting excel seems to clear things up. Sometimes restarting the computer helps. Oftentimes it doesn't and seems to clear up randomly on its own later.

Has anyone ever noticed similar issues?

My guess is that perhaps there's a memory lag/leak/etc. or something that is degrading performance, but I would assume restarting the app or the computer would fix that. We usually attribute it to 'the server being slow,' but we notice similar issues even when the files are local. It's just really frustrating to be testing a bit of code that is blazingly fast the first few times you run it, then suddenly, seemingly at random, it screeches practically to a halt, and then later that day or week it's all back to normal like nothing happened.

Thanks
posted by johnstein to Computers & Internet (8 answers total) 2 users marked this as a favorite
 
This is probably a silly question that you researched a long time ago, but have you noticed a load on the machine at the times when the code is slow? Is it during an anti-virus check? Crap like that?
posted by fusinski at 12:14 PM on May 21, 2008


Response by poster: That's a fair question. I've checked in the past, as you said, but haven't checked recently. Currently my code is slogging along, so I just checked the performance.

EXCEL.EXE is taking up 50% of the CPU.
System Idle Process is taking up about 47-48%, with Firefox, Acrobat, and a few other programs fighting over the last 2-3%.
posted by johnstein at 12:19 PM on May 21, 2008


Best answer: You might want to try modifying your code to write to a range, instead of writing to individual cells through a for loop. I made a similar modification to a statistical analysis package I was working on a few years ago, and the speed increase was amazing.

This page is a pretty good post on the topic.
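Very roughly, instead of the cell-by-cell loop, it looks something like this (just a sketch; ws and myArray are placeholder names for a worksheet and a 2-D array):

nRows = UBound(myArray, 1) - LBound(myArray, 1) + 1
nCols = UBound(myArray, 2) - LBound(myArray, 2) + 1
ws.Range(ws.Cells(1, 1), ws.Cells(nRows, nCols)).Value = myArray   ' one assignment for the whole block

The win is that you cross the VBA-to-Excel boundary once, instead of once per cell.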
posted by steinwald at 1:05 PM on May 21, 2008 [2 favorites]


The most common reason the performance of interpreted languages varies drastically is garbage collection.

The easiest way to avoid it is to save your spreadsheet, terminate Excel, start it again, reload the spreadsheet, and then do your calculation. The memory heap should start clean.
posted by Class Goat at 1:53 PM on May 21, 2008


I see that you've tried restarting Excel. The problem here is how to make the VB runtime libraries unload, and I'm not sure that's easily done.
posted by Class Goat at 1:55 PM on May 21, 2008


Class Goat - You could write a native VB application that drove an Excel object.

Terminating the application should unload the resources, assuming objects are properly handled.
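Untested sketch (VB-style late binding; the path is just an example):

Dim xl As Object, wb As Object
Set xl = CreateObject("Excel.Application")    ' fresh Excel instance, fresh heap
Set wb = xl.Workbooks.Open("C:\example\book.xls")
' ... run the calculation here ...
wb.Close SaveChanges:=True
xl.Quit
Set wb = Nothing
Set xl = Nothing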
posted by steinwald at 2:00 PM on May 21, 2008


"... We usually attribute it to 'the server being slow' but we notice similar issues even when files are local. ..."

If it is really this random, I'd strongly suspect that some element of what you are doing is not local, and is, in fact, server dependent. I've seen things as weird as shadow copies of what users swore were local files cause big slowdowns on networked machines. I can't say definitively that such a thing is the root of your problem, but I would sure be looking hard at network traffic, in conjunction with CPU usage, when such slowdowns occur.
posted by paulsc at 2:03 PM on May 21, 2008


There are still a few more things you can do:

(i) Set application.enableevents = False
This can sometimes make a massive difference. Don't forget to set it back to True on every exit path from the procedure (see the sketch at the end of this comment).

(ii) If you have opened the VBE at any time during that instance of Excel, it will run more slowly, even if you close it again. Restart Excel to resolve.

(iii) The most noticeable performance gain (depending on the size of the array) comes from never looping through an array and writing to individual cells, as you seem to be doing (.cells(i,j) = array(i,j)), if you can help it.
If you're writing to more than a hundred or so cells in one procedure, you should write to a variant array and then dump that in one fell swoop to the worksheet. It is MASSIVELY quicker - test it and see. Something like:
Range(sh.Cells(1, 1), sh.Cells(UBound(myArray, 1) + 1, UBound(myArray, 2) + 1)) = myArray

(iv) Memory leaks. I generally haven't noticed any in Excel (and I used to do a lot of VBA) but you can introduce them in your code if you don't explicitly set objects to Nothing in your procedures.

Only (ii) and (iv) would explain the inconsistencies, however. (i) would also, if you happen to have other add-ins loaded occasionally that fire when you write to cells, which will raise sheet change events. On preview: paulsc makes a good point.
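To illustrate (i) and (iii) together, a rough sketch only, with placeholder names and assuming a 1-based 2-D variant array:

Sub DumpArray(ws As Worksheet, myArray As Variant)
    On Error GoTo CleanUp
    Application.EnableEvents = False          ' (i) no sheet change events while writing
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual
    ' (iii) one assignment instead of a cell-by-cell loop
    ws.Range(ws.Cells(1, 1), ws.Cells(UBound(myArray, 1), UBound(myArray, 2))).Value = myArray
CleanUp:
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
    Application.EnableEvents = True           ' restored on every exit path
End Sub

Execution falls through to CleanUp either way, so the settings get restored whether the dump succeeds or blows up.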
posted by NailsTheCat at 2:14 PM on May 21, 2008

