Need a Beefy Site-Map Generator
November 11, 2008 10:16 AM Subscribe
Attention Web Folk: I'm looking for a visual site-map generator, preferably cheap or free, that can handle an absolutely labyrinthine site. Details follow!
I run a site that has thousands and thousands of pages, some of which are automatically generated. Because of absent-mindedness or scripting errors, some of these pages have dead or misdirected links on them, or maybe I just forgot about them.
What I would like is a program that would go through and index all the pages in a visual (or at least nested) format; that is, it would show one page and show all the pages it links to, and then all the pages THEY link to, and so on. Basically something that I can skim through and see that, oh, this page has some bad links on it.
I have found a hundred different site-map generators, but they all just produce a flat list of URLs. I need something a little more informative!
Thanks Hive Mind!
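The nested index described above can be sketched as a small recursive crawler. This is a minimal illustration, not a production tool — it assumes Python's standard library only, ignores robots.txt and rate limiting, and treats any URL starting with the site root as internal:

```python
import urllib.request
import urllib.error
import urllib.parse
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collect href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url, root, seen=None, depth=0):
    """Fetch `url`, print it indented by link depth, and recurse into
    every link that stays under `root`. Unfetchable pages are flagged
    inline instead of stopping the crawl."""
    if seen is None:
        seen = set()
    if url in seen:
        return
    seen.add(url)
    try:
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except (urllib.error.HTTPError, urllib.error.URLError) as exc:
        print("  " * depth + f"{url}  [BROKEN: {exc}]")
        return
    print("  " * depth + url)
    parser = LinkParser()
    parser.feed(html)
    for href in parser.links:
        child = urllib.parse.urljoin(url, href)
        if child.startswith(root):  # stay inside the site
            crawl(child, root, seen, depth + 1)
```

Skimming the indented output shows each page followed by the pages it links to, with bad links marked `[BROKEN: ...]` right where they occur.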
If you're specifically looking for dead links, use Xenu Link Sleuth.
posted by softlord at 4:48 AM on November 12, 2008
Among other things, wget can recursively download an entire web site by following links from its index file. I'd just tell it to do that, using its quiet option to suppress all the helpful noise it usually generates while it's doing its thing. That should mean that the only thing it spits out is errors for the links it can't find targets for. You'd still end up with a big flat list, but it would be a list of precisely those links that need fixing.
posted by flabdablet at 4:20 PM on November 11, 2008
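A sketch of the wget approach described above (exact log wording varies by wget version, and `example.com` stands in for the real site). One wrinkle: wget's fully quiet `-q` flag suppresses errors too, so `-nv` (terse one-line-per-URL output) into a log file is closer to what the comment intends:

```shell
# Recursively request every internal link without saving files.
#   --spider : fetch pages but do not store them
#   -r       : follow links recursively
#   -nv      : terse output instead of progress bars
#   -o run.log : write that output to a log file
wget --spider -r -nv -o run.log http://www.example.com/

# Failed requests, and wget's end-of-run broken-link summary,
# can then be pulled out of the log:
grep -i 'broken\|404' run.log
```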