What would you do with a BIG spreadsheet on a SLOW computer?
October 22, 2010 11:38 PM

My very important spreadsheet has become too big to use without excel/calc slowing so much that I want to take an axe to the computer. Help!

I have a spreadsheet that I use at work to keep track of a special customer's purchases. A colleague started it years ago, never thinking it would reach 4000 rows. Now that it's at 4000 rows, it's becoming unusable. Every operation, even just opening the search box, takes 5-15 seconds to complete. That's making the job of keeping track of this customer's books a giant pain in the ass. And I don't want it to be a giant pain in the ass, because I like this customer and the books I send her are pretty much the only thing she has to distract her from the illness she has.

I need to be able to enter title, author and date, and the price of each book, plus another column that tells me whether the author is a favourite or has been blacklisted.

The file is saved on a network drive, so that I can use it on any computer in the store. I also have a backup on a flashdrive. Most of the computers at work are slow. The techies are too busy to look into a problem that concerns just ONE customer, and my own personal in-house techie is stumped, so I need to figure this out myself. Do you have any ideas?
posted by pootler to Computers & Internet (20 answers total) 2 users marked this as a favorite
 
You're using a spreadsheet as a database.

Get a database.
posted by orthogonality at 12:05 AM on October 23, 2010 [13 favorites]


In excel, at least, you can turn calculation off, and then hit F9 when you actually need it to recalculate. I assume there's a similar setting in calc.
posted by pompomtom at 12:11 AM on October 23, 2010


Best answer: Orthogonality is right. Spreadsheets are meant to store a small amount of data. You need a database to efficiently handle a larger amount of data. One example of database software is Access... which is not the most user-friendly.

If you have internet access throughout the store, I recommend you set up a free account with Zoho Creator. (www.zoho.com) It's pretty easy to configure and works like a breeze. You'll need to upload your existing spreadsheet to Zoho, and then work from Zoho moving forward. You can export to excel at any time if that's necessary. And it's free!
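
For what it's worth, here's a rough sketch of what the same data looks like in a local database. It uses Python's built-in sqlite3 rather than Access or Zoho, purely to show the shape; the file, table and column names are placeholders based on the columns described in the question.

```python
import sqlite3

# Placeholder database file; one table with the columns from the question.
conn = sqlite3.connect("customer_books.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS books (
        title       TEXT,
        author      TEXT,
        date_sent   TEXT,
        price       REAL,
        author_flag TEXT   -- e.g. 'favourite' or 'blacklisted'
    )
""")

# Add a book only after checking it hasn't been sent before; this kind of
# lookup stays fast even with tens of thousands of rows.
title, author = "Some Title", "Some Author"
already_sent = conn.execute(
    "SELECT 1 FROM books WHERE title = ? AND author = ?", (title, author)
).fetchone()
if not already_sent:
    conn.execute(
        "INSERT INTO books VALUES (?, ?, ?, ?, ?)",
        (title, author, "2010-10-23", 12.99, ""),
    )
    conn.commit()
conn.close()
```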
posted by samthemander at 12:37 AM on October 23, 2010 [5 favorites]


orthogonality is both concise and correct.
posted by Biru at 12:48 AM on October 23, 2010 [1 favorite]


If a database is not an option, do you need all the information at once? I'd archive the older 'records' into separate files, so you have customer_2005.xls, customer_2006.xls, customer_2007.xls etc., and the file you're entering data into is much smaller and more manageable. It might make cross-referencing previous books/authors more laborious, but if the file really is grinding the computer to a halt, it may work out faster anyway.
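
If you go this route and can export the sheet to CSV first, the split is easy to script. A rough sketch, assuming the date sits in the third column and starts with a four-digit year such as "2010-10-23" (adjust both to match the real file):

```python
import csv
from collections import defaultdict

# Split an exported CSV into one file per year, based on the date column.
rows_by_year = defaultdict(list)

with open("customer_books.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)          # assumes the first row is a header
    for row in reader:
        year = row[2][:4]          # assumes the date is in column 3
        rows_by_year[year].append(row)

for year, rows in rows_by_year.items():
    with open(f"customer_{year}.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
```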
posted by missmagenta at 12:57 AM on October 23, 2010 [1 favorite]


4,000 rows really shouldn't be a problem with only a few columns worth of data. are there any formulas?

it sounds as if the sheet itself may be mildly corrupted (yes, that's the technical term).

you might try to save a copy in another format (csv, or an older excel format) and then resave the copy back into the original format.

also, does it open/run smoothly on a fast computer? if so, part of your problem might be network related.

lastly, how large is the file in mb?
posted by kimyo at 1:37 AM on October 23, 2010


I'm with kimyo. 4000 rows should not be a big deal at all. I suspect something else is afoot here.

If you MeFi mail me, I can take a look at the file in question. From that point, I should be able to better diagnose the problem you are having. I don't mind doing this at all; in fact, I am a little interested in seeing what is bringing your computer to its knees.

Also with kimyo in asking how big it is. Is it gigantic?
posted by milqman at 2:06 AM on October 23, 2010


Yeah, without formulas there should be no issue. The problem comes when you have formulas that cause the solver to iterate many times to converge on a stable answer. For example, if you had a long circular chain of dependencies (cell 1 depends on cell 2, which depends on cell 3, which depends on cell 4, .... which depends on cell 100, which depends on cell 1) then you run into slowness. This link talks about how to modify the recalculation behavior. But from what you described there should be no circular references, so check all your formulas for some hidden silliness that you can eliminate.
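
As a toy illustration of why that gets slow (nothing to do with Excel's actual engine), here's a Python sketch of a circular chain of 100 "cells": every pass has to re-evaluate the whole chain, and the values only settle after a few dozen passes.

```python
# Each "cell" depends on the next one, and the last depends on the first,
# so the chain is circular and has to be recalculated until it stabilises.
N = 100
cells = [0.0] * N

def recalc(cells):
    # Hypothetical formula: each cell is a damped copy of its neighbour.
    return [0.5 * cells[(i + 1) % N] + 1.0 for i in range(N)]

passes = 0
while True:
    new = recalc(cells)
    passes += 1
    if max(abs(a - b) for a, b in zip(new, cells)) < 1e-9:
        break
    cells = new

print(f"settled after {passes} passes over all {N} cells")
```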
posted by Rhomboid at 2:16 AM on October 23, 2010 [1 favorite]


Use a bibliographical database; they're made for this. Bookends, Endnote, take your pick. I've used both. On the Mac, Bookends seems to be more stable (I might be wrong), but both are and remain fast, and that's the point here. Plus they're made to interact with the common writing applications.

On the main interface you see the entire list. For each reference you can enter the title/author/whatnot info, and then there's a bunch of things you can add: notes in free form, URLs; you can even attach files to the titles, such as receipts or e-mail exchanges.
The basics are rather easy to learn. When my old XP computer died, I was also happy to find out that databases from one of these applications can be exported to the other without major hitches.
posted by Namlit at 3:10 AM on October 23, 2010


Response by poster: I'm surprised that 4000 rows is such a problem too. And yep, I knew we were using a spreadsheet as a database - wish I'd had the foresight to start transferring it to a database when I inherited the first 1000 entries.

There are no calculations involved. In fact, the costs are temporary additions - once I've worked out the cost of a particular package of books, I erase the prices and the formula. It's all text, apart from the date.

Depending on which computer I use, it gets saved in either the usual OpenOffice Calc or Excel format. Using a faster computer doesn't seem to make a significant difference; it's only the difference between wanting to take an axe to the PC after 2 minutes on a slow one or after 5 on a faster one.

I DO need ALL the info on it each time I use it, to make sure I'm not sending a book twice, or horrors, sending something from a blacklisted author. Otherwise I would have saved it as separate smaller files, sorted by author.

Milqman, what a kind and generous offer. :-) I'll mefimail you - there could well be a problem in the file itself which I am too much of a noob to see.

And I'll look into databases that I can import the spreadsheet into - that might be a good solution.
posted by pootler at 4:05 AM on October 23, 2010


If every little thing is slow, one of two things is likely happening on your old computer: (a) it has a feebly inadequate amount of RAM installed, and everything you do on it makes it thrash or (b) some process or other is using up more than its fair share of the available CPU cycles, leaving not enough for stuff you care about like responding quickly to mouse clicks.

To check (a), click Start, right-click My Computer and choose Properties. If the information box shows you that you have less than 512MB of RAM, that's likely your problem - especially if you happen to be running one of the big-name antivirus/security suites, most of which have evolved into dreadful RAM hogs. I've become quite fond of Panda Cloud Antivirus, which is lightweight enough to leave an XP box with as little as 256MB of RAM quite usable.

To check (b), hold down the Shift and Ctrl keys and then tap the Esc key. This will bring up the Windows task manager. Click the Processes tab, then click the CPU column heading to sort by process CPU usage, then click it again to put the heaviest CPU user at the top of the list, then scroll the list of processes to the top. If anything other than System Idle Process is using a significant amount of CPU, that's your problem.

You might also care to click the Performance tab and check the Commit Charge and Physical Memory numbers. If Commit Charge (current) is as big or bigger than Physical Memory, something is using up all the RAM.
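
If poking at Task Manager by hand gets tedious, roughly the same checks can be scripted. This is just a sketch; it assumes Python and the third-party psutil package are installed, which may well not be possible on a locked-down store machine.

```python
import time
import psutil  # third-party: pip install psutil

# Overall memory picture, roughly the Performance tab's numbers.
mem = psutil.virtual_memory()
print(f"RAM: {mem.percent}% used of {mem.total // (1024 ** 2)} MB")

# Prime the per-process CPU counters, wait a second, then read them.
procs = list(psutil.process_iter(['name']))
for p in procs:
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass
time.sleep(1)

usage = []
for p in procs:
    try:
        usage.append((p.cpu_percent(interval=None), p.info['name']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Top five CPU users, like sorting the Processes tab by the CPU column.
for cpu, name in sorted(usage, key=lambda t: t[0], reverse=True)[:5]:
    print(f"{cpu:5.1f}%  {name}")
```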

In the last few months I've seen a lot of Windows boxes start chewing up all available RAM and show svchost.exe eating 90%+ CPU for a good half hour after startup, making the computer almost utterly unusable during that time. It's something crazy happening with Automatic Updates, and the workaround is to stop the Automatic Updates service (via right-click My Computer, Manage, Services), then rename or delete the C:\WINDOWS\SoftwareDistribution folder, then reboot. The only downside as far as I can tell is the loss of the history of applied updates when you use the Windows Update or Microsoft Update web sites.
posted by flabdablet at 4:21 AM on October 23, 2010


> The file is saved on a network drive

Just because nobody's said this yet, this might also affect the speed of any operation you perform on the spreadsheet. Or at least, the speed of it getting the instruction and you seeing the result. Test by running it off a local computer's hard-drive.
posted by AmbroseChapel at 4:21 AM on October 23, 2010 [2 favorites]


Response by poster: I'll try out all your other suggestions when I'm back at work on Monday. Keep them coming though!
posted by pootler at 4:34 AM on October 23, 2010


If the file is huge, check for:

A. An accidentally gigantic print range causing Excel to define the spreadsheet as huge.

B. Blank cells that are not empty, which is a common corruption issue. Go to the last row of real data, say 4001, and manually delete all of the unused rows down to the bottom of the sheet (row 65,536 in the old .xls format). Then do the same with the unused columns (there are 256 of them). Save and check the size via Properties. (A rough scripted version of this step is sketched at the end of this answer.)

C. Check for phantom links in blank cells within the active range.

D. If all else fails, copy the data range to a worksheet in a new file.

Oh and if it's on a network drive, try turning off file sharing.
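
For step B, if you'd rather script it than select and delete by hand, here's a rough openpyxl sketch. It assumes the workbook has been saved as .xlsx first (openpyxl can't read the old .xls or .ods formats) and that the real data ends at row 4001, column E; the filename and those limits are just placeholders.

```python
from openpyxl import load_workbook  # third-party: pip install openpyxl

PATH = "customer_books.xlsx"  # placeholder filename
LAST_DATA_ROW = 4001          # adjust to wherever the real data actually ends
LAST_DATA_COL = 5             # column E

wb = load_workbook(PATH)
ws = wb.active

# If the sheet's recorded extent runs past the real data, chop off the excess.
if ws.max_row > LAST_DATA_ROW:
    ws.delete_rows(LAST_DATA_ROW + 1, ws.max_row - LAST_DATA_ROW)
if ws.max_column > LAST_DATA_COL:
    ws.delete_cols(LAST_DATA_COL + 1, ws.max_column - LAST_DATA_COL)

wb.save("customer_books_trimmed.xlsx")
```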
posted by carmicha at 4:39 AM on October 23, 2010


If you're considering moving it to a database, can't go wrong with PHP and MySQL. If memory serves, Excel will export to CSV and phpMyAdmin will import it.
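
Whatever database ends up on the other end, the import step has the same shape. Here's a sketch using Python's built-in csv and sqlite3 modules in place of MySQL/phpMyAdmin, with placeholder file and column names; it assumes the CSV has a header row and five columns.

```python
import csv
import sqlite3

conn = sqlite3.connect("customer_books.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS books "
    "(title TEXT, author TEXT, date_sent TEXT, price REAL, author_flag TEXT)"
)

with open("customer_books.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    conn.executemany(
        "INSERT INTO books VALUES (?, ?, ?, ?, ?)",
        (row[:5] for row in reader if len(row) >= 5),
    )

conn.commit()
conn.close()
```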

Et voila!
posted by Biru at 5:42 AM on October 23, 2010


Nthing that 4000 rows is not big. I do "cast of thousands" populations for my novels; this year I can tell you exactly what day in the last 60 years each of the relevant 8,500 people died on, how old they were that day, whether they won't die until after the story starts, etc., and the feeble XP machine I have does NOT spend five minutes doing a search.

I would save a local copy to rule out the server being an issue.
posted by SMPA at 5:44 AM on October 23, 2010 [2 favorites]


Best answer: The fact that you're saving it back and forth between two different file formats frequently may be a part of the problem, as well.

My experience is more with documents than with spreadsheets, which may be different, but every time you switch formats between Word and other things, there tends to be some cruft left behind that doesn't show up as anything in the file, but is still there bloating the file size, and getting interpreted into nothing on the page.

Given that you don't seem to need formatting, and everything is straight text or dates, I'd save the file as a CSV, rather than as excel or calc. That way, the only thing that gets saved will be the data you need.
posted by jacquilynne at 6:22 AM on October 23, 2010


Sometimes a simple 'Save As...' helps remove junk from a file.

It mostly just reduces file size, but if you're accessing this over a network drive over a slow network, size matters.
posted by mazola at 7:06 AM on October 23, 2010


Response by poster: I do need to be able to sort the fields, so saving in .csv isn't an option.

File sharing is already turned off.

I have no authority or ability to change anything on the computer itself re anti-virus programs etc, and I also have to use this file across about 5 different computers anyway. Don't think the overworked techies are going to want to do that for me. ;-)

So here's what I've done so far: Cleaned it up and got it down to about 3,400 entries. Saved a local copy as .ods. Selected a bunch of empty fields to the right and at the bottom and deleted everything that might be in them. The file WAS 371 KB as an .xls. Now it's 104 KB as an .ods. Now the response time is 2-4 seconds, which I can probably live with. I'm beginning to suspect that it's a problem with the computers and the network rather than the file.
Should I feel the urge to take an axe to the computer again, I'll work through the other suggestions as much as I can and have a look at turning it into a database.

Thank you all. You totally rock. :-)
posted by pootler at 8:44 AM on October 25, 2010


Saving it as .csv shouldn't make any difference to whether you can sort the fields or not. Once you open it back up in either Excel or Calc, you still have the same ability to sort the fields as you would have if you saved it natively.
posted by jacquilynne at 1:34 PM on October 25, 2010


This thread is closed to new comments.