So I've set up an old PC as a webserver using my home ADSL connection, mainly for my own education (and I'm certainly learning a lot!). I use it mainly to serve my blog, which up till now has been a very small-scale personal project for friends and family (awstats reports 4361 hits last month). I've been thinking about posting a few longer articles about semi-obscure but possibly useful-to-some-people topics (managing bibliographies in OpenOffice, HDR photography in Linux, etc.). Thus, I'm wondering how my connection will cope if several people try to connect at the same time - which also got me thinking about image/page optimisation, etc.
So, in the interests of learning more about this stuff, I need advice on how to simulate multiple connections to my server. At first I thought there might be a free online service for this, but then I realised that such a service would be an easy way to DoS someone else's server, and was thus unlikely to exist. I have access to a (Linux) work computer on a much faster connection, so presumably I could simulate multiple connections to my server from there (I'm thinking upstream bandwidth from the server will probably be the bottleneck). I've had a quick look at openwebload, which seems to be along the right lines. Ideally I'd like to be able to test the responsiveness of the front page, of different pages, static vs. dynamic pages, images on my server vs. hosted elsewhere, etc., with different numbers of connections.
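To give an idea of what I'm after, I could probably hack something together along these lines from the work machine - a rough Python sketch (stdlib only; the function and variable names are just mine, and the self-test server is a stand-in for my actual blog URL) that opens N connections at once and times each response:

```python
import threading
import time
import urllib.request

def fetch(url, results, i):
    """Fetch one URL, recording (HTTP status, elapsed seconds) in results[i]."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        resp.read()
        results[i] = (resp.status, time.monotonic() - start)

def load_test(url, concurrency):
    """Fire `concurrency` simultaneous GETs at `url`; return per-request timings."""
    results = [None] * concurrency
    threads = [threading.Thread(target=fetch, args=(url, results, i))
               for i in range(concurrency)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

if __name__ == "__main__":
    # Quick self-test against a throwaway local server; I'd point `url`
    # at the blog's front page (or an individual article) for a real run.
    import http.server
    server = http.server.HTTPServer(("127.0.0.1", 0),
                                    http.server.SimpleHTTPRequestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = "http://127.0.0.1:%d/" % server.server_address[1]
    for status, secs in load_test(url, 10):
        print(status, "%.3fs" % secs)
```

Re-running it with different URLs and concurrency levels would presumably show me the static-vs-dynamic and local-vs-remote-images comparisons I mentioned - but I assume the proper tools do this better, hence the question.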
I know that the sensible thing to do, if I were expecting a large amount of traffic to a particular article, would be to have it hosted somewhere else, but I'm a geek and I feel I might learn something if I try it myself. I know a lot of MeFites are sysadmins, etc., so I'd be grateful for any suggestions. Just ask if you need more details.