website security
March 31, 2009 6:00 AM   Subscribe

Can files be hidden on a website in such a way that virtually all visitors will not be able to access them without a key or specific url?

I wish to share with certain people sensitive information. It is not financial information or of a sort that I am worried hackers would want to get a hold of. It is information that I don't want to share with the general public and which some individuals may want to see. They probably aren't going to come looking for it, but if they do, is there a way to make it secure and hidden?
posted by dances_with_sneetches to Technology (14 answers total) 4 users marked this as a favorite
 
You can password-protect the directory. It's pretty easy on an Apache server. Even better, don't link to the directory anywhere on the site; just email the URL to the people who need it.

If indexing is turned off, no one will know the directory exists unless you tell them it does. Anyone attempting to access it will need to enter a user name and password, or will be rejected.

I've done this for personal reasons, and also for a departmental website, to hide some portions of the site from those who shouldn't see it, while still allowing others in (for example, job application data on the site was hidden from the public but accessible to those on the search committee).
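
For reference, a bare-bones .htaccess for this might look something like the following (the realm name and the password-file path are just placeholders; adjust them for your own server):

AuthType Basic
AuthName "Restricted Area"
AuthUserFile /home/youraccount/.htpasswd
Require valid-user

The .htpasswd file holds the usernames and hashed passwords, and should live somewhere outside the web root.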
posted by caution live frogs at 6:07 AM on March 31, 2009


If you own the website, you can put your files in a folder with an obscure name and place a blank index.html file in that folder so a directory listing never shows the contents. However, if somebody knows the direct URL of a file, they can still access it.

I would suggest a .htaccess restriction on the folder, which would require a password in order to view anything in that directory or any of its subdirectories. Depending on your hosting provider, they may be able to assist you with creating a .htaccess file. If not, google it; there are automated generators out there that will create one for you.
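
If you have shell access, the password file itself is normally created with the htpasswd utility that ships with Apache; roughly (the path and username here are just examples):

htpasswd -c /home/youraccount/.htpasswd somefriend

The -c flag creates the file, so leave it off when adding more users later or you'll overwrite the existing ones.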
posted by InsaneRhino at 6:10 AM on March 31, 2009 [2 favorites]


The .htaccess solution is the easiest. If you have web programming skills you can use a cookie authentication scheme. The general idea is you include a small script at the top of the file that dies if authentication fails, preventing the rest of the page from loading.
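
A rough sketch of that guard in PHP (assuming a PHP site; the session variable name and login page path are just placeholders):

<?php
// auth_check.php -- include this at the top of every protected page
session_start();
if (empty($_SESSION['authenticated'])) {
    header('Location: /login.php');  // send them to a login page instead
    die();                           // stop the rest of the page from rendering
}
?>

Your login page would verify the password and set $_SESSION['authenticated'] = true before redirecting back.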

As mentioned, make sure your page isn't going to be indexed by Google. Be sure to put in a robots.txt file to prevent this (google robots.txt). In this way, you can simply create an oddly named directory within your root to house the file. Keep in mind, if a hacker ever DID get into your server, that'd be the first place they'd go.
posted by teabag at 6:22 AM on March 31, 2009


Why not just use Hamachi and set up a Windows file share for the data? That way you have a totally encrypted VPN to transfer your files.
posted by Mach5 at 7:04 AM on March 31, 2009


Response by poster: Thanks. I'll check out the htaccess. Mach5 - I want to do more than transfer files. I want to display information that is accessible as needed. live frogs - how do you turn off indexing?
posted by dances_with_sneetches at 7:21 AM on March 31, 2009


There are technical solutions, but before you jump into one of those, understand that if you put something on the web for long enough, somebody is going to find it, no matter how good your security is.

So, yes, you can make the data "secure" (for some definitions of that term) and you can make sure it's "hidden", but you certainly can't make sure that only the people you want to see it can actually see it. Even if your security is air-tight, once it leaves your server (i.e., is downloaded by someone else) you have zero control over it.

Also keep in mind that if you host your data through someone else, there are a ton of folks with access to those files (the host's employees). So at some level you have to decide what is an acceptable level of risk.
posted by toomuchpete at 7:24 AM on March 31, 2009


You turn off indexing by including this line in the .htaccess file:
Options -Indexes

Including password protection is almost as simple; the apache link clf gave you earlier explains it in detail.
posted by ook at 7:34 AM on March 31, 2009


Another vote for starting by creating a robots.txt file. I just set up a website for my company, and looking at our tracker, Google (and about a dozen other crawlers) indexes our site as much as three times a day. For more info see the link below.

http://en.wikipedia.org/wiki/Robots.txt
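
For what it's worth, a robots.txt that asks every well-behaved crawler to stay out of the whole site is just two lines at the site root:

User-agent: *
Disallow: /

It's purely advisory, though; only polite crawlers obey it.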
posted by Gainesvillain at 7:43 AM on March 31, 2009


I wonder if Google Docs is secure enough for your purposes. You can set documents to be private, while still sharing them with specific people. It certainly would be much less hassle than setting up something on your own.
posted by oulipian at 7:53 AM on March 31, 2009


On the other hand, putting a folder in robots.txt is just another way of saying, 'look, a folder is here!'. Google will ignore the folder, but any human reading robots.txt could be intrigued.

Security through obscurity is bad!
posted by cayla at 7:56 AM on March 31, 2009


You've got to double-layer the directory you list in robots.txt, because the entry in robots.txt is a fireworks display saying 'look here' to anyone reading the file. I've even used the contents of a robots.txt to download most of a website via HTTrack.

So set your robots.txt to instruct bots not to index /secret, and then put your files in /secret/randomstring with a blank index file in /secret. That'll stop honest robots without giving away the farm to the nefarious.
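
Something along these lines (the names are placeholders, obviously):

User-agent: *
Disallow: /secret/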

That's all just a band-aid on top of good password protection, though.
posted by Mitheral at 10:12 AM on March 31, 2009


Read the Apache docs (or IIS docs) on authentication, and implement whatever works best in your environment (Active Directory, Digest, MySQL databases, etc.).

Apply SSL certificates to the directory so everything, including the login procedure (cookies, digest, basic, database, etc.) and the content itself, is encrypted during transfer. This makes it harder for an eavesdropper to pick the actual password (or a weak hash of it) out of the traffic, which matters if you use a weak form of hashing or the passwords are simple.
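
As a quick sketch, if mod_ssl is enabled and a certificate is already installed, adding this single directive to the protected directory's .htaccess will refuse any request that doesn't come in over HTTPS:

SSLRequireSSL

(It denies plain http:// requests outright rather than redirecting them; use a mod_rewrite rule instead if you'd rather redirect.)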

If you want to be really paranoid, watermark each download in real time with the username and download time on a non-editable document (editing-locked PDFs work fine, though this is somewhat complex to implement). This tends to discourage end users from forwarding confidential documents to others when their name is all over them.

Robots.txt and obscure directory names are fairly weak methods of defense. One forwarded or posted hyperlink in a public forum will expose your site.
posted by benzenedream at 11:21 AM on March 31, 2009


Why don't you just put the content in an encrypted zip file?
posted by glider at 11:38 AM on March 31, 2009


Zip's encryption is laughably insecure. If you want to take this object-level approach, use something like TrueCrypt.
posted by Mitheral at 12:13 PM on March 31, 2009


This thread is closed to new comments.