As a new webmaster, what should I do to protect my sites from hacking?
February 26, 2004 8:16 PM   Subscribe

I'm new to this whole running-a-website thing. Reading the log files, I noticed that people are already trying to hack me. Right now it's the basic probing for FrontPage access (which, obviously, I don't have, as I write everything in TextPad), but I want to know what steps I should take to secure the server. How can I protect FTP directories, monitor bandwidth theft in the form of linking to images, etc.? Can I prevent that altogether?

Also, the two most common sources of error pages are browsers looking for favicon.ico (I assume this is Gecko and Opera browsers, unless a lot of IE users are bookmarking me) and people (?) looking for pages that don't, in fact, exist but logically might. For example, there is an 001.html but no 005.html. Is this likely to be a person or a bot of some kind? I should add that there is a link to 005.html, but it is commented out and has its visibility set to hidden; that way, when there is an 005.html, all I need to do is remove the comments and change the visibility for the link to show up.
posted by Grod to Computers & Internet (13 answers total)
The 005.html thing sounds like a spambot trolling for any link it can find -- commented out or not. Most "hack attempts" you'll see in your logs are just worms and bots sniffing around for machines to compromise.

What web server are you running? Apache? IIS? What OS? A lot of "lockdown" information is very specific to what server and OS you're running.

To prevent bandwidth theft by image linking, you'll have to make sure that image requests have an HTTP refer(r)er from your own server. It's easy in Apache--there are multiple HOWTOs on the matter.
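As a rough sketch of what those HOWTOs describe (assuming mod_rewrite is available and overrides are allowed; example.com is a placeholder for your own domain):

```apache
# .htaccess -- refuse image requests whose Referer is another site.
RewriteEngine On
# Allow requests with no Referer at all (direct hits; some proxies strip it).
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred from our own site.
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Anything else asking for an image gets 403 Forbidden.
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```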
posted by zsazsa at 8:32 PM on February 26, 2004

If your Web server currently runs Apache and is configured to allow overrides, the answer to nearly all of your questions is .htaccess. To address your specific points, you can:

1. Use the <Limit> section of .htaccess to deny access to specific IP addresses, blocks of IPs, or domains, shutting out those who chronically probe your server for weaknesses.

2. Prevent people from seeing the contents of directories without index files (e.g. index.html)

3. Use the mod_rewrite module (assuming it's been loaded) to prevent people from hotlinking to images on your server and consuming bandwidth.

...among countless other things, again provided your Web server has them enabled.
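As a sketch, points 1 and 2 might look like this in an .htaccess file (the IP and domain below are placeholders, and `Options -Indexes` only works if your host's AllowOverride settings permit it):

```apache
# 1. Deny access to chronic offenders by IP or domain.
<Limit GET POST>
  Order Allow,Deny
  Allow from all
  Deny from 192.0.2.15           # placeholder IP address
  Deny from .badcrawler.example  # placeholder domain
</Limit>

# 2. Turn off directory listings for folders lacking an index file.
Options -Indexes
```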

Some other notes:

The majority of "people" trying to hack you are most likely spiders, robots, or worms sweeping across the Internet looking for resources to exploit (e.g. formmail scripts, guestbook pages to spam, etc.). If you watch your logs carefully, you'll see dozens of them on a daily basis, but they generally check for their target files and leave. Insistent hammering is usually the sign of malicious users or broken robots. .htaccess can deny access to both.

To prevent the favicon.ico error, put a favicon.ico (16x16 pixel Windows icon) file in the root Web directory on your server. If you don't have any icon, you can make this a zero-byte file, if only to prevent the error.
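For example, from a shell on the server, a zero-byte placeholder can be created like so (run from your web root, whose path varies by host):

```shell
# Create an empty favicon.ico so browsers requesting it get a 200, not a 404.
touch favicon.ico
```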

Hiding links from legitimate browsers will do little to conceal them from spiders and the like. If the address is contained within the HTML, commented out or not, it's fair game for all who parse the page source. As you've already witnessed, security through obscurity rarely works as intended, so don't rely on obscurity in any part of your duty as a webmaster. Your server will thank you for it.
posted by Danelope at 8:46 PM on February 26, 2004

Response by poster: OK. I believe the server is Apache; 1&1 is my host.
ON PREVIEW: Thanks, the .htaccess link is very helpful!
posted by Grod at 8:54 PM on February 26, 2004

A couple of other minor points: You can use a robots.txt file to prevent robots from accessing certain (or all) parts of your site. Check your access logs every so often, just to get an idea of what the traffic is like — I once discovered a guy was stalking a friend by doing that (long story).
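A minimal robots.txt, placed in the web root, that keeps well-behaved robots out of a couple of hypothetical directories looks like:

```
# robots.txt -- honored only by well-behaved robots.
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
```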
posted by brool at 10:46 PM on February 26, 2004

The problem with robots.txt is that the majority of robots on the internet -- particularly the malicious e-mail harvesters and comment/guestbook/formmail exploits -- ignore the file entirely. As such, it cannot be relied upon for securing one's site.

Most legitimate search engines do query robots.txt for exclusions, though even the biggest (e.g. Google, Yahoo) still exhibit flaky behavior from time to time.
posted by Danelope at 11:06 PM on February 26, 2004

Agreed that it doesn't prevent a malicious attack — but it's still a step in properly running a web site. The various spidering engines can take a surprising amount of bandwidth; I was seeing Yahoo hitting my site twice a day, and there was really no good reason for it.
posted by brool at 11:16 PM on February 26, 2004

" is my host"

This being the case, your job is 95% done. If 1and1 weren't protecting the server against the known exploits these spiders and bots probe for, your server would get hacked daily. It's safe to assume they'll handle the fine-grained details of securing the server.

Other than that you just need to make sure your code is secure and doesn't reveal passwords. And if you aren't doing any backend scripting or database access then you don't even need to worry *too much* about that.

If you *are* doing backend stuff, especially database access, it might be a good idea to take a look at some books on web application security. This one is very good.
posted by y6y6y6 at 6:59 AM on February 27, 2004

One clear sign of malicious activity is failed attempts to load formmail.cgi (or variations on that name). This is an old, insecure web-form to e-mail script (so don't use it!) that spammers have exploited in the past, so they'll go trolling for it. I've tried installing a "tarpit" script under that name, but it occasionally trapped legitimate activity for some reason, so I took it down.

I found an excellent list of bots to exclude a while ago, which contains a link to an "almost perfect .htaccess ban list."
posted by adamrice at 7:07 AM on February 27, 2004

To expand on what adamrice said: If you're using any scripts from here (which many hosts have installed by default), replace them immediately with ones from here.
posted by staggernation at 7:13 AM on February 27, 2004

Oh, and if there are scripts installed in your cgi-bin directory that you're not using on your site, delete them.
posted by staggernation at 7:17 AM on February 27, 2004

Actually, I just noticed that 1&1 is on the list of ISPs using the NMS scripts, so that shouldn't be a concern for you. Still good to be aware of, though.
posted by staggernation at 7:29 AM on February 27, 2004

And, in terms of monitoring bandwidth theft of images, this is not too hard if you're already used to looking at your logfiles. Just make sure that the hitcounts on images are roughly equivalent to the hitcounts for the pages they are on. When I've had issues with people hotlinking images, I've seen the image count go to 1000 while the page hit count was at 12. It's fairly obvious and usually easy to deal with by a quick note to the page owner or sysadmin, replacing the image with raunchy porn [and altering your local link to go to the good image you've renamed] or just using .htaccess to ban loading the images when the referrers are anything but your own site's URL. It takes a bit of messing to make sure this works right, but it's well worth the peace of mind.
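As a quick sketch of that comparison, here's one way to tally hits per path from a Common Log Format access log (the log lines below are fabricated for illustration; in practice, point awk at your real log file):

```shell
# Fabricated sample lines in Common Log Format.
cat > access.log <<'EOF'
192.0.2.1 - - [27/Feb/2004:12:00:00 -0500] "GET /001.html HTTP/1.0" 200 512
192.0.2.2 - - [27/Feb/2004:12:00:01 -0500] "GET /img/photo.jpg HTTP/1.0" 200 9000
192.0.2.3 - - [27/Feb/2004:12:00:02 -0500] "GET /img/photo.jpg HTTP/1.0" 200 9000
EOF
# Field 7 is the requested path; count requests per path, busiest first.
# An image far outpacing the page that embeds it suggests hotlinking.
awk '{ print $7 }' access.log | sort | uniq -c | sort -rn
```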

A lot of the errors you see in your logfiles, with people trying to exploit your mailform or looking for FrontPage extensions, are fairly normal and don't mean you are the victim of a targeted attack. Keeping an eye on the logs already makes you more conscientious than most webmasters out there.
posted by jessamyn at 12:33 PM on February 27, 2004

Response by poster: OK, question about formmail. I'm using Jack's formmail.php, and I've followed all the directions to make it "secure." Is it? What is the best way to do what formmail.cgi and formmail.php do?
1and1 gives the option to create forms, but I don't like the wizard interface or the lack of control it provides in terms of how it's integrated with the rest of the site. One odd thing about 1and1 is that (at least with the plan I have) you can't actually view the cgi directory; it's a level or so higher than I have access to. I used PuTTY to figure out the absolute path and get a list of scripts, so I can change my forms over to whatever script 1and1 has installed by default, but I don't have any way to configure the script itself.
Does that make sense?
posted by Grod at 1:46 PM on February 27, 2004

This thread is closed to new comments.