We’ve recently been doing some work on optimising websites for the best possible performance. To achieve this we’ve tuned Apache’s httpd.conf and set up static caching servers such as Varnish. Some of these changes have made a huge difference on their own; for example, we recently installed Varnish on our own hosting platform, allowing us to serve many more users with exactly the same hardware. A full tutorial on optimising a server is perhaps for another time, however. In this post we will be discussing how to check whether those optimisations actually paid off, by load testing a website.
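For context, Varnish sits in front of Apache and answers cacheable requests from memory rather than passing them to the web server. A minimal backend definition looks something like the sketch below; the port numbers are illustrative and assume Apache has been moved off port 80 so Varnish can listen there instead:

```vcl
# /etc/varnish/default.vcl sketch (Varnish 4+ syntax; values illustrative)
vcl 4.0;

backend default {
    .host = "127.0.0.1";   # Apache, moved to a local port
    .port = "8080";        # Varnish itself listens on port 80
}
```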
Load testing services can be expensive depending on where you look; however, we have been fortunate enough to find a more or less free alternative. The service in question is called Loader.io. It offers paid premium options, but for many small website owners the free tier should work just fine. Setting up for testing is relatively simple: register for an account and add a domain name. As a security measure you will have to verify that the domain is yours, since they obviously don’t want their service used as a DDoS botnet. It’s also worth mentioning that we’d recommend you only use this service against your own private hosting, such as a VPS or a dedicated server. Load testing shared hosting could add unnecessary strain, slowing down not just your website but those of everyone else on the same machine.
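Verification typically amounts to serving a token file from your domain’s root. The token name below is purely illustrative, as Loader.io generates a unique one for each target:

```shell
# Hypothetical token; Loader.io gives you a unique "loaderio-..." string
TOKEN="loaderio-abc123"

# Create the verification file containing the token
echo "${TOKEN}" > "${TOKEN}.txt"

# Copy it into your web root so it is reachable at
# http://yourdomain/loaderio-abc123.txt, for example:
# sudo cp "${TOKEN}.txt" /var/www/html/
```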
After verifying your site we can move on to creating our first test. When setting up a new test you can experiment with any of the testing modes and as many clients as you wish. In our case we chose to maintain a client load over a large range of users to see how our hosting held up. The details of our testing set-up can be seen below.
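If you want a rough local sanity check before running the hosted test, ApacheBench can generate a comparable concurrent load. The sketch below just assembles the command rather than firing it, and the URL and figures are placeholders:

```shell
# Placeholder values roughly mirroring a "maintain client load" test
URL="https://example.com/"
CLIENTS=250
DURATION=60   # seconds

# ApacheBench: -c concurrent clients, -t time limit in seconds
CMD="ab -c ${CLIENTS} -t ${DURATION} ${URL}"
echo "${CMD}"   # run the printed command against your own server only
```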
As the test runs you should find that your server’s response time increases as the number of clients increases. This is perfectly normal: under more load the server will increasingly struggle to serve pages to every user. High response times are not ideal, but they only become a real problem once they grow large enough for visitors to notice. The amount of stress a server comes under is often tied to the complexity of the website; a static HTML site, for example, will be able to serve many more users than a fully fledged CMS.
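You can spot-check response times yourself while a test runs by timing individual requests with curl. The sketch below times a local file:// URL purely so it runs anywhere; in practice you would point it at your own domain:

```shell
# Create a tiny local page so the example works without a live site;
# replace the file:// URL with your own domain in practice
echo "hello" > page.html

# -w '%{time_total}' prints how long the whole request took, in seconds
curl -o /dev/null -s -w 'total: %{time_total}s\n' "file://$(pwd)/page.html"
```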
Interpreting the Results
The graph above gives us an idea of how much load our server can take. A server can keep serving pages under very heavy load, but it will do so increasingly slowly. In interpreting these results, then, we need to decide on the maximum acceptable response time for our server. Realistically, a response time greater than 3000 ms (three seconds) would probably cause users to turn away from your site. In the graph above, 280 clients were connected when the server’s response time hit 3000 ms, so we can use that figure to tune our Apache web server to allow roughly this many concurrent connections. To be on the safe side, you should probably set the limit a little lower to leave a small margin of error.
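With the prefork MPM, the relevant Apache setting is MaxRequestWorkers (called MaxClients on Apache 2.2 and earlier), which caps the number of simultaneous connections. A sketch with illustrative values, set slightly below the 280-client figure above to leave that margin of error:

```apache
# httpd.conf / mpm_prefork.conf sketch (values are illustrative)
<IfModule mpm_prefork_module>
    StartServers             5
    MinSpareServers          5
    MaxSpareServers         10
    MaxRequestWorkers      250
    MaxConnectionsPerChild   0
</IfModule>
```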
We very much hope that this post has been of use to you. If you aren’t currently getting particularly good results with your load testing, feel free to comment below with the kind of hosting set-up you’re using, as we plan to write some posts on optimising both websites and servers in the future.