Best platform for database, forms, etc
April 2, 2021 1:56 PM   Subscribe

My wife's nonprofit organization has a medium-sized database running in Microsoft Access 2010 on an antiquated server running Windows Server 2008. It's getting increasingly creaky, and she's been trying to push them towards migrating to a new platform. What are the options?

The problems with the current setup are:
  • It's slow because of the old hardware.
  • It's flaky. Sometimes the reports start randomly dropping results that should be included, and somebody has to press a magic "Compact and Repair Database" button to fix it, which terrifies me.
  • Scripting in VBA is fucking terrible.
  • Nothing is version-controlled, so it's hard to roll back changes when something breaks.
I'm trying to help them figure out what platform to migrate to.

They need to be able to do the following standard things:
  • Create forms to input data.
  • Write and store queries to run reports.
  • Query the database directly sometimes.
  • Bulk-upload data from spreadsheets (mainly Google Sheets).
So, the actual questions:
  1. For hardware, my feeling is that it'll be simpler (and more cost-effective) to pay for cloud servers that can be upgraded as necessary rather than physical hardware. Is this accurate? (Amazon is out for ethical reasons.)
  2. For software/db platform, is there a better all-in-one solution than Microsoft Access?
  3. Or should they go with something like Django and postgres, where there will be more setup work required but (hopefully) more control and flexibility? Or some other roll-your-own platform? (A rough sketch of what that might look like follows this list.)
  4. Other things that I'm not considering?
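To make question 3 a bit more concrete, here's roughly what I picture the roll-your-own bulk-upload path looking like with PostgreSQL. This is only a sketch: the table, column, and file names are placeholders, and it assumes the Google Sheet has already been exported to CSV.

```python
# Hypothetical sketch: bulk-load a CSV export from Google Sheets into PostgreSQL.
# Table, column, and file names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=nonprofit user=app_user host=localhost")
with conn, conn.cursor() as cur, open("donations_export.csv", newline="") as f:
    # COPY streams the whole file in one shot, which is far faster than row-by-row INSERTs.
    cur.copy_expert(
        "COPY donations (donor_name, amount, donated_on) "
        "FROM STDIN WITH (FORMAT csv, HEADER true)",
        f,
    )
```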
posted by number9dream to Technology (9 answers total) 4 users marked this as a favorite
 
Have you looked at airtable?
posted by pyro979 at 2:33 PM on April 2, 2021 [1 favorite]


SQL Server Express is free as long as your data set is under 10GB (I think that's the current limit) and should work well with Access as a front end. If the main problem they're having is just the crappiness of Jet requiring frequent database rebuilds, moving the data to SQL Server instead would probably be the easiest solution.
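Once the data is in SQL Server it's easy to sanity-check how close you are to that cap. A rough sketch with pyodbc; the server, database, and driver names are assumptions, so adjust them to whatever you actually install:

```python
# Rough sketch: report how close a SQL Server database is to the Express
# edition's 10 GB per-database data cap. Connection details are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost\\SQLEXPRESS;DATABASE=NonprofitDB;Trusted_Connection=yes;"
)
row = conn.cursor().execute(
    # size is in 8 KB pages; type = 0 limits it to data files, which is what the cap covers
    "SELECT SUM(size) * 8 / 1024.0 FROM sys.database_files WHERE type = 0"
).fetchone()
print(f"Data files: {row[0]:.1f} MB of the 10,240 MB Express limit")
```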

If you do want to go with a different solution for whatever reason, I really like PostgreSQL.

Unless you're talking 50+ users, you don't actually need anything particularly fancy in terms of the server. I've got clients with 25 users running SQL Server-backed software on old quad-core Xeons with spinny disks, with no complaints about speed even with some fairly significant virtualization overhead. You can get a Lenovo server with plenty of grunt for a small organization, plus an SAS card, for under a grand.

I'd probably be more in favor of cloud solutions if decent internet service were more readily available, but the folks I deal with often only have VDSL or DOCSIS available at their sites and only rarely both.
posted by wierdo at 2:49 PM on April 2, 2021 [2 favorites]


Some people swear by FileMaker Pro
posted by Morpeth at 3:39 PM on April 2, 2021 [1 favorite]


The #1 consideration here is who is going to manage whatever solution it is. Is it you? If so, how much time are you interested in putting into this on an ongoing basis? If it's someone else, what are they comfortable with? Is it everyone (ie, no one)? If so, how technical is the average person?
posted by inkyz at 4:03 PM on April 2, 2021 [9 favorites]


It's flaky. Sometimes the reports start randomly dropping results that should be included, and somebody has to press a magic "Compact and Repair Database" button to fix it, which terrifies me.

OK that seems bad. I used to do quite a lot of MS Access work, and, as I recall, the only thing Compact and Repair was good for was actually defragmenting and compacting the structures used to store data within the .mdb/.accdb file. These become fragmented very quickly, and the compact utility can produce dramatic reductions in the size of the database file.

Now, the relevance here MAY be that your Access file is very large, and that it sometimes grows beyond the 2 GB size limit for Access databases.

I have actually seen one in the field that this had happened to! The users did regular large data pulls from a linked ODBC table, and they never deleted any rows. I was called in because the Access database was showing really odd behavior, with irrelevant-sounding error messages coming up if you tried to open a query. I think it was stuff like not being able to resolve the built-in Now() function to get the system time, when the query showed no signs of referencing anything to do with dates or times at all. This is what happens when you stray beyond the forbidden size: you don't get an error message informing you, hey chief, your Access database is now too big; you just get unpredictable and cryptic errors.

So something like reports dropping results until Compact and Repair is run, reducing the file size below 2 GB so that the thing will work again, sounds like a really consistent theory that is worth testing.
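If you want to test it, even something as simple as logging the file size over time (especially right before and after a Compact and Repair) would show whether the trouble starts as the file approaches the limit. A throwaway sketch, with a placeholder path:

```python
# Throwaway sketch: log the size of the Access file so you can see whether the
# weirdness correlates with approaching the 2 GB limit. The path is a placeholder.
import os
from datetime import datetime

DB_PATH = r"\\server\share\nonprofit.accdb"
TWO_GB = 2 * 1024 ** 3

size = os.path.getsize(DB_PATH)
print(f"{datetime.now():%Y-%m-%d %H:%M}  {size / 1024 ** 2:,.0f} MB "
      f"({size / TWO_GB:.0%} of the 2 GB limit)")
```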

Man I am glad to be out of that. What a mess.
posted by thelonius at 4:43 PM on April 2, 2021


I know this is not exactly what you asked, but are they sure they need a general purpose database? So many of the things I see people using Access for (membership databases, inventory management, etc.) are problems that are also solved by software that is designed to do that specific thing. Most organizations' needs are not as unique as they think. It may at first seem that specialized software is more expensive and less flexible than building your own database, but I suspect that if you factor in maintenance hassles, the cost will be competitive and the lack of flexibility an ok tradeoff...
posted by primethyme at 4:48 PM on April 2, 2021 [3 favorites]


Best answer: This is the sort of work that I do/did in the non-profit world. This has three parts: the database, the DB interface, and the hardware. It is a big ask to pitch all of the existing systems, which mostly work, for something that may work and will have new bugs, but regardless will be new and different. Staffing, cost, and adaptability are the primary questions that will help make this decision.

So first, the database. The easy solution is incremental: just migrate it to the new version of Access. MS provides a reasonable path for this, and most of the existing scripts should (if they weren't crud to begin with) work after the update. thelonius's note about file size might be key, since Access still has the same size limits. Version control of various forms is supported. Another solution is an alternative database like MySQL or PostgreSQL; these have the benefit of broad support and can be cheap to operate, but the people who set up and run them are expensive, version control on them is still a headache, and security is a major issue (improper setup = disaster). The third option is to go with a SaaS analytic/reporting solution that is entirely online, like Tableau, or maybe a CRM; it's not clear what sort of data you are handling. These trade the expense of human labor for expensive software licenses, and any training/help is still pretty expensive. It's lovely if you like paying for consultants.
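For what it's worth, the raw data move is usually the small part of a migration like that. A rough sketch of pulling one table out of Access and into PostgreSQL, assuming the Microsoft Access ODBC driver is installed; the paths and table/column names are made up:

```python
# Rough sketch: copy one table's rows from an Access file into PostgreSQL.
# Assumes the Microsoft Access ODBC driver is installed; all names are placeholders.
import pyodbc
import psycopg2
from psycopg2.extras import execute_values

access = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\nonprofit.accdb;"
)
rows = access.cursor().execute(
    "SELECT donor_name, amount, donated_on FROM Donations"
).fetchall()

pg = psycopg2.connect("dbname=nonprofit user=app_user host=localhost")
with pg, pg.cursor() as cur:
    # execute_values batches the INSERTs instead of sending them one at a time
    execute_values(
        cur,
        "INSERT INTO donations (donor_name, amount, donated_on) VALUES %s",
        [tuple(r) for r in rows],
    )
```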

The DB interface is described as forms and "VBA sucks," which, uh, I have bad news then about all of the alternatives. VBA isn't below average, and its limitations are not something I would expect to be an issue for moderate-sized projects. I have a project that runs VBA against a MySQL database and it's fine, serving thousands of people. Other options have the downside of starting all of your forms from scratch, which for a few forms isn't a big ask. But new forms mean retraining the staff, which means new bugs. A clean slate does allow for a fresh approach to the whole process, which in my experience becomes a major delay as all of that gets hashed out. I'm also curious about all the extra scripting that needed to happen, and why those needs weren't met with the existing report functions. That is an organizational issue, and might not be solved with just a different DB.

As for hardware? I have a mix of local and cloud services, and I would go with Azure, MS's cloud solution, if I had a clean-sheet, start-from-nothing project. It's nice and not particularly expensive, but there is still a steep learning curve; it's just not as bad as DigitalOcean (from the outside, anyway) or AWS (the terror, the potential costs). The level of service we are at means that we have never had an issue with downtime, unlike Network Solutions, which took 13 projects down a few weeks ago. The budget has to be made to accommodate the steady cost, which can be an issue in the feast-and-famine mode of many non-profits.

I have moved everything local to a single Dell server, and in 2021 it just feels more like a liability than an asset. I just feel outgunned: even behind a nice Barracuda firewall someone else provides, even with years of running my own server, even with a couple of years without issue, the risk feels greater. The machine gets hammered on in waves, and it makes me want to take the next step and just containerize all the VMs and run them elsewhere. Or get a new job where it isn't my problem anymore.

If there were an easy, inexpensive solution they would have already done it. I have managed both types of transitions, incremental and clean slate; there is no magic answer that fits every situation, both can take longer than expected, and both can introduce new problems. The fact that the org has put itself in this position seems to indicate that they can't manage change well. But perhaps they have been transformative in other areas and simply need to focus their energies on this issue.
posted by zenon at 4:52 PM on April 2, 2021 [7 favorites]


Impossible to say without spending a few days getting into the guts of the database.

It may be possible to do it in multiple stages rather than a full-sweep replacement, but it's not necessarily better.

Depending on how the tables are accessed, it may be possible to replace the Access Jet database engine with MS SQL Server and simply migrate the data over, with minimal rewriting of the forms and reports. That should increase the speed significantly, IMHO. Stick with the free Express edition, which has a 10GB database size limit, for now. The Developer license is free, IIRC.

Once the immediate problem is solved, it's time to consider the long-term migration, and this is where you need to decide whether to stay with Microsoft or do a full rewrite from scratch. But that should only be decided AFTER you have FULLY documented EVERY report, every form, and EVERY function the current version performs, i.e. "write the specs" so you know what you have to recreate on a different platform.
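Even a crude automated inventory helps with that documentation step. A sketch using Access's COM automation via pywin32 (assumes Access is installed on Windows; the path is a placeholder) that at least lists every form, report, and saved query:

```python
# Sketch: dump an inventory of forms, reports, and saved queries from an Access
# database via COM automation (pywin32). Requires Access on Windows; path is a placeholder.
import win32com.client

app = win32com.client.Dispatch("Access.Application")
app.OpenCurrentDatabase(r"C:\data\nonprofit.accdb")

for form in app.CurrentProject.AllForms:
    print("FORM  ", form.Name)
for report in app.CurrentProject.AllReports:
    print("REPORT", report.Name)
for qd in app.CurrentDb().QueryDefs:
    print("QUERY ", qd.Name)

app.CloseCurrentDatabase()
app.Quit()
```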
posted by kschang at 12:42 AM on April 3, 2021 [1 favorite]


Hi! Longtime nonprofit database admin here, who has previously worked at a nonprofit database company and still does consulting in this space.

You don't say what exactly they're tracking in this database, but I would assume it's constituents using their services and/or donors? If so, there are tons of cloud-based CRM (constituent relationship management) services out there that will meet these needs and require little to no infrastructure upkeep on the nonprofit's end. As inkyz noted, a big consideration here is that a lot of non-out-of-the-box solutions are going to require significant ongoing investment. CRMs are managed solutions, so the upkeep is really in the actual database management as opposed to the infrastructure.

Here's a good list of options to start with.
posted by anotheraccount at 6:57 AM on April 5, 2021 [3 favorites]


This thread is closed to new comments.