XMHelp!! XML and phpMyAdmin import problem
June 22, 2007 11:52 AM

MySQL, phpMyAdmin and XML import/export problem...

So about three months ago I was running a MediaWiki server for some people who wanted to back up their data and move to a fresh MediaWiki install. So I used phpMyAdmin as a front end to MySQL, exported the database to XML, and tossed it on a burned CD somewhere.

So, this week the people I'm running the server for have come back to me, and they need to get their old wiki back up and running under a different database. No problem, I say, and I grab the backed-up XML database.

The problem is phpMyAdmin seems to have no support for importing XML data, only SQL. Does anyone know of a shell script/program/bit of dark magic to translate phpMyAdmin's XML into valid SQL that I can import?

This data is mission-critical and needs to be reloaded, so I can't tell these people to go screw. Any ideas? I'm on Linux, and I have physical access to the machine if it matters.
posted by SweetJesus to Computers & Internet (10 answers total)
 
I've used Navicat whenever I've needed to get data dumps into MySQL and didn't feel like writing a custom script.

It imports pretty much anything -- the only gotcha is you'll need remote access to the DB (although you can always set up a DB locally, import into that, then dump it as SQL).

It may be worthwhile for you to learn about mysqldump and how to import from the mysql command line. I feel that dumping with phpMyAdmin is a bad idea except for quick backups (i.e., 'huh, wonder what this query is gonna do').
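
For reference, the round trip is just (user and database names here are placeholders):

mysqldump -u root -p wikidb > wikidb.sql
mysql -u root -p newwikidb < wikidb.sql

mysqldump writes plain CREATE TABLE and INSERT statements, so that dump re-imports anywhere the mysql client runs -- no XML translation step next time. (Create the target database first, though; the import won't do it for you.)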
posted by fishfucker at 12:33 PM on June 22, 2007


Navicat has a 30-day trial, and I'm assuming you should be able to do the import in about 10 minutes, so you should be OK. Not totally sure how well it'll deal with multiple tables, but I guess you'll find out!
posted by fishfucker at 12:35 PM on June 22, 2007


Response by poster: It may be worthwhile for you to learn about mysqldump and how to import from the mysql command line. I feel that dumping with phpMyAdmin is a bad idea except for quick backups (i.e., 'huh, wonder what this query is gonna do').

From what I've read, I think phpMyAdmin is using mysqldump to do its thing. I could be wrong about this, as it's not my area of expertise.

The tool is interesting, and I'll take a look, but I'm looking for a (preferably) free solution. My bosses are not going to be willing to spend any money on software to fix this problem, and as a last resort will probably put an intern on it and have him/her do it by hand (as we know, interns are cheaper than solutions). I'd really like to avoid this, as it's going to make me look bad if I can't get this data back.

On preview: didn't see the part about a 30-day trial. I'll give it a shot.
posted by SweetJesus at 12:40 PM on June 22, 2007


Huh, I never checked the source, but I'm pretty sure phpMyAdmin 'exports' are just it recursing the entire damn table/database through a template.

which takes forever, sucks up resources, and often times out (at least it did on the 600,000-row table I had to dump regularly).
posted by fishfucker at 1:01 PM on June 22, 2007


Yeah, grep -R 'mysqldump' * in my phpmyadmin dir just gives me:
Documentation.html:    3.4 I am unable to import dumps I created with the mysqldump tool
Documentation.html: The problem is that older versions of mysqldump created invalid comments like this:
Documentation.html:    appears once in every dump created with mysqldump. If you want to run your
Documentation.txt:3.4 I am unable to import dumps I created with the mysqldump tool bundled with
Documentation.txt:The problem is that older versions of mysqldump created invalid comments like
Documentation.txt:once in every dump created with mysqldump. If you want to run your dump you
Plus, then you'd probably have to tell it your mysql bin location during setup.

posted by fishfucker at 1:04 PM on June 22, 2007


Response by poster: This database is not huge by any means, just big enough to cause a headache (about a 25 MB XML file). I've gone through the process of importing the database through the tool, but I'm a bit confused when it asks me for the "table delimiters" or something like that.

The format of the XML file specifies a <database-name> tag at the top, and then tons of individual <table-name> tags with XML-ized table properties. When Navicat asks me for table delimiters, I'm not too sure what it's looking for. I told it it was <wikidb> (the name of my previous wiki database), and it's in the back lab chugging along.

I must say I don't have too much confidence in it working. It's been going for an hour or so, and that's more time than should be necessary on a 3.4 GHz dual-processor box with 4 GB of RAM.

Any other implementation ideas would be appreciated...
posted by SweetJesus at 1:57 PM on June 22, 2007


Response by poster: You're right about phpMyAdmin not using mysqldump. I think I mixed it up with something I read while trying to debug this issue.
posted by SweetJesus at 2:00 PM on June 22, 2007


Best answer: A 25 MB XML file?

Yeah, there is something way wrong... it's not gonna go through...

Hmmmm. Unfortunately, the phpMyAdmin stuff I've read so far suggests that their XML tool is NOT for backup (it's probably meant for an import into SQL Server or something), so things may get a little wacky...


Hmm. OK, here's a possible solution -- hope someone there has Office:

1. dump with phpMyAdmin to XML (you already did this! yay!)
2. create a new database in MS Access
3. File/Get External Data and select the XML file
4. it'll give you a table list; click OK
5. it'll say there are errors, but that looks like it's because it can't deal with the comments
6. save your .mdb and close out of Access
7. open the .mdb with Navicat (again, you may need the Windows version for this)
8. import into your MySQL install
9. crack a beer
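
Or, skipping Access entirely: the layout you described (a <database-name> root, then one <table-name> element per row with a child element per column) is regular enough that a short script ought to do it. A rough, untested sketch -- the sample rows, column names, and escaping rules here are all assumptions, and you'd still have to recreate the tables first, since that XML carries no CREATE TABLE schema:

#!/usr/bin/env python
# Sketch: walk a phpMyAdmin-style XML export (root element named after
# the database, one element per row named after its table, one child
# element per column) and print an INSERT statement per row.
# SAMPLE is made-up data; for the real file, swap in
# ET.parse('backup.xml').getroot().
import xml.etree.ElementTree as ET

SAMPLE = """<?xml version="1.0" encoding="utf-8"?>
<wikidb>
  <page>
    <page_id>1</page_id>
    <page_title>Main_Page</page_title>
  </page>
  <page>
    <page_id>2</page_id>
    <page_title>Sandbox</page_title>
  </page>
</wikidb>"""

def sql_quote(text):
    # Quote a column value for MySQL, escaping backslashes and quotes.
    # Everything is emitted as a string; MySQL coerces numeric columns.
    if text is None:
        return "NULL"
    return "'" + text.replace("\\", "\\\\").replace("'", "\\'") + "'"

def rows_to_inserts(root):
    # Each child of the database element is one row: its tag names the
    # table, and its children name the columns.
    for row in root:
        cols = ", ".join(col.tag for col in row)
        vals = ", ".join(sql_quote(col.text) for col in row)
        yield "INSERT INTO %s (%s) VALUES (%s);" % (row.tag, cols, vals)

if __name__ == "__main__":
    for statement in rows_to_inserts(ET.fromstring(SAMPLE)):
        print(statement)

Pipe the output into the mysql client and you're done; for the real 25 MB file, xml.etree's iterparse would keep memory down.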
posted by fishfucker at 3:56 PM on June 22, 2007


BTW, my guess on why the XML file isn't working is that Navicat isn't smart enough to break the schema up into tables -- Access does this swimmingly, though. I would totally cancel out of that Navicat process -- I hope it's not hooked up to a production DB at the moment.
posted by fishfucker at 3:57 PM on June 22, 2007


Response by poster: Thanks for the advice. We ended up deciding that we really didn't need the data anyway (or at least we didn't want to jump through the hoops we'd need to get it), and are looking at other solutions.

However, I appreciate the effort. Thanks!
posted by SweetJesus at 2:36 PM on June 29, 2007


This thread is closed to new comments.