blocking "illicit" sites
June 2, 2005 12:16 AM   Subscribe

i'm in a central-asian country right now working with an NGO-run hospital. they've recently acquired satellite-based internet access, and are concerned about limiting website access for their staff.

basically, instead of using keywords to block sites and searches, we are hoping to use blocklists (ideally the free DNS query lists at rhs.mailpolice.com) to reference with our local DNS server (Microsoft Windows Server 2003).

at this point, i've got no idea how to have server 2003 do this, or, failing that, how to set up another linux box to act as a gateway. any ideas?

thanks!!

-tim
posted by quadrinary to Computers & Internet (3 answers total)
 
Get some free or cheap blocking and squealer software from somewhere here.

If the problem is that the hospital lacks the bandwidth to allow employees to download silly crap, tell employees that the names of the top ten downloaders will be posted publicly (or sent to their managers). But maybe encourage them to use the web during off hours by telling them you don't count downloads made between certain hours.

If it's content (or content plus bandwidth) that you're worried about, try using software that lets you build a list of good sites instead of filtering out a list of bad sites. Then make a simple procedure for requesting that a site be added to the list. Within a short time, assuming you respond to requests promptly, the list will contain all sites that employees regularly need.
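The allowlist-plus-request-procedure idea above can be sketched in a few lines. This is a toy illustration, not any particular product's behavior; the host names and variable names are made up:

```python
# Toy sketch of the "list of good sites" approach: a request is
# permitted only if the host is on the allowlist; anything else is
# queued so an admin can decide whether to add it. Names are
# illustrative assumptions, not from any real deployment.

allowed = {"who.int", "www.who.int"}
pending_requests = []

def check(host: str) -> bool:
    if host.lower() in allowed:
        return True
    pending_requests.append(host)  # staff can ask for it to be added
    return False
```

The point of the queue is exactly what pracowity describes: respond to requests promptly and the list converges on what staff actually need.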
posted by pracowity at 3:21 AM on June 2, 2005


If you are dealing with a network, rather than just a single machine (I'm assuming so, given the server), what you need to do is install a proxy server, and block normal outbound/inbound web traffic, then force the browsers through the proxy server. This becomes your point of control.

Then, you configure the proxy server. Since you're using Windows, I can't offer you specific products -- I just don't know the market. I know Microsoft has one (having dealt with problems with it...) but I don't know costs, etc.


pracowity's comment about whitelisting rather than blacklisting is technically correct, but labor intensive, unless there are only a few sites you know they need access to.

If bandwidth's an issue, many proxy servers and routers can also be bandwidth throttles. 56K per user makes web pages slow but still usable, but makes downloading warez painfully slow. The sat router itself may well have bandwidth controls as well.
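The throttling that proxies and routers do is commonly a token-bucket scheme; here is a minimal sketch, assuming roughly 56 kbit/s ≈ 7000 bytes/s per user (the numbers are illustrative):

```python
# Minimal token-bucket sketch of per-user bandwidth throttling, the
# kind of rate limiting a proxy or router applies. Tokens (bytes)
# refill at a fixed rate up to a burst cap; a transfer is allowed
# only if enough tokens are available.

class TokenBucket:
    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes  # start full: an initial burst is allowed
        self.last = 0.0

    def allow(self, nbytes: int, now: float) -> bool:
        # Refill for elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False
```

A web page fits in the burst and loads fine; a large download keeps draining the bucket and is forced down to the refill rate.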
posted by eriko at 4:26 AM on June 2, 2005


Need to move back a step. The list you linked there is an RBL, designed to let mail servers check whether or not incoming mail is originating from a known spam source. RBLs happen to be implemented in DNS because it's a good lightweight and distributed way to store information about domains, but an RBL provided via a nameserver has nothing to do with regular DNS lookups.
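Mechanically, an RBL lookup just reverses the IP's octets, appends the list's zone, and does an ordinary DNS query: any answer means "listed", NXDOMAIN means "not listed". A minimal sketch (the zone name here is a placeholder, not a real list):

```python
import socket

def rbl_query_name(ip: str, zone: str) -> str:
    # 192.0.2.1 checked against rbl.example.net becomes a lookup
    # for 1.2.0.192.rbl.example.net
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str) -> bool:
    try:
        socket.gethostbyname(rbl_query_name(ip, zone))
        return True              # any A-record answer = listed
    except socket.gaierror:
        return False             # NXDOMAIN = not listed
```

Which is why it's useless for general web filtering: it answers "has this IP sent spam?", not "is this site objectionable?".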

If you want to use an RBL to control Web access, you need a proxy capable of checking an RBL -- but I don't think that will be that easy to find, because "block users from going to domains that have appeared in spam" isn't the sort of thing that people typically enforce in a Web proxy.

If you want to be able to maintain a blacklist of Web sites that provide objectionable content (and filter on the sites' URLs and addresses, and not the content), then you can get what you want with DansGuardian. There are commercial and free blacklists maintained by third parties for DG which seem to be kept reasonably up to date. They'll have categories like "sex" and "gambling" and "hate sites" and so forth, while RBLs are only going to give you "once appeared in spam".
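The style of matching those blacklists use can be sketched simply: per-category domain lists, where a listed domain also blocks all of its subdomains. Categories and entries below are made up for illustration, not taken from any real DansGuardian list:

```python
# Sketch of per-category URL blacklisting: strip the host down
# through its parent domains and see if any of them appears in a
# category's list, so "www.casino.example" is caught by an entry
# for "casino.example".

BLACKLISTS = {
    "gambling": {"casino.example", "bets.example"},
    "sex": {"adult.example"},
}

def blocked_category(host: str):
    host = host.lower().rstrip(".")
    parts = host.split(".")
    # The host itself plus every parent domain.
    candidates = {".".join(parts[i:]) for i in range(len(parts))}
    for category, domains in BLACKLISTS.items():
        if candidates & domains:
            return category
    return None  # not on any list
```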

In any case, the filtering isn't typically handled in DNS (mostly because that's trivial to work around). Filter web sites at the HTTP level instead.
posted by mendel at 8:37 AM on June 2, 2005

