

Giant but simple database - how to build
February 5, 2010 5:53 AM   Subscribe

What is the best method to create a non-techy local interface for a 40M+ record database that I'd like to host on my web server? I'd like to regularly import new data, crunch it, and export specific CSV reports (which will be much smaller than 40M rows).

The data will essentially be a 40M-row set of address records. I will receive new data every month. I want to query it and export reports of around 70K rows. Ideally these could be automated and output to a webserver.
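The monthly import-crunch-export cycle described here can be sketched roughly as follows. This is only an illustration: it uses Python's built-in sqlite3 as a stand-in for the MySQL server, and the table, columns, and sample rows are all hypothetical (on the real server you'd use a MySQL driver, and LOAD DATA INFILE for the 40M-row bulk import).

```python
import csv
import sqlite3

# An in-memory database stands in for the MySQL server here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE addresses (street TEXT, city TEXT, state TEXT)")

# Monthly import: bulk-load the new batch of address records.
rows = [("1 Main St", "Helena", "MT"),
        ("2 Oak Ave", "Bozeman", "MT"),
        ("3 Elm St", "Boise", "ID")]
conn.executemany("INSERT INTO addresses VALUES (?, ?, ?)", rows)

# Crunch: a report query that pulls a much smaller subset.
report = conn.execute(
    "SELECT street, city FROM addresses WHERE state = ? ORDER BY street",
    ("MT",),
).fetchall()

# Export: write the report as CSV for the webserver to serve.
with open("report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["street", "city"])
    writer.writerows(report)
```

The same shape works against MySQL; only the connection line and the bulk-load step change.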

I will also want an easy way to just look at and play with the data a bit. I am familiar with running basic reports and the like in MS Access, and would love to be able to use it to play with the data (rather than getting fancy with SQL all the time).

I have a webserver with MySQL set up and running.

I plan on using an Elancer to set this up for me and get it running with sample data. What should I ask them to do?
posted by mtstover to Computers & Internet (2 answers total)
 
You need to ask them to do what you've described above. They ought to be able to help you create a slightly tighter spec: how to secure it, maintain the code, and so on.

As far as tools and frameworks go, there are lots of options out there - Django, Catalyst, or simple PHP. All of these provide the sort of functions you're looking for, with varying degrees of complexity. You might also take a look at the off-the-shelf MySQL front-ends out there.
posted by jquinby at 6:28 AM on February 5, 2010 [1 favorite]


Django. It would be pretty ideal here, since you get a built-in admin interface for free on top of your database schema. Additionally, Python has nice built-in support for reading (and writing) CSV.
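As a quick illustration of that built-in CSV support (the file name and fields here are made up): the csv module's DictWriter/DictReader pair writes rows from dicts and reads them back keyed by the header line.

```python
import csv

# Write a tiny report file with a header row.
with open("sample.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["street", "city"])
    writer.writeheader()
    writer.writerow({"street": "1 Main St", "city": "Helena"})

# Read it back: each row comes out as a dict keyed by the header.
with open("sample.csv", newline="") as f:
    rows = list(csv.DictReader(f))
```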
posted by i_am_a_Jedi at 6:42 AM on February 5, 2010


This thread is closed to new comments.