Apache Bench does a decent job of giving you a quick assessment of whether an application can handle load, but for a more advanced performance assessment, better tools are required. Visual Studio has excellent support for load testing, but using it requires Visual Studio Enterprise (or Ultimate, depending on your VS release). Not everyone can afford that investment; luckily, there are free utilities, such as Apache JMeter, that are more robust than Bench.
JMeter is a powerful solution, but unfortunately getting started can be difficult. The application isn't the most intuitive, and although the Apache documentation is available online--and is very thorough--it follows the technical approach of other Apache projects, so getting-started documentation that covers the basics is sparse.
Most people, though, are just looking for a way to quickly benchmark the more intensive pages of an application. For the medical education testing system that my team and I built, the exam pages were the most database intensive, and since database connections are a top bottleneck in most systems, these were the pages we wanted to test. Unfortunately, under the time crunch we had, load testing didn't begin until four days prior to launch. Our application was behind Shibboleth authentication, and the testing system required session information to navigate through the exam. On top of that, the exam does more than serve question and answer content: it also saves responses, strikes through questions, marks questions, and allows for highlights. Properly load testing the entire process without setting up a recorded script in a more advanced system like LoadRunner wasn't going to be possible given our resources and timeline. Instead, I had to pick a specific set of pages to target and run through each of them in a makeshift process.
Fortunately, JMeter allows you to set up connections for various pages and then run through them all during a testing cycle. Spoofing Shibboleth authentication would have required session variable manipulation, but our Shibboleth implementation is hosted by a central authority, not by us. Although I could spoof my own account, I wouldn't be able to spoof additional virtual users without them all being me. Instead, we created a duplicate instance of the application, removed it from Shibboleth, and enabled Windows authentication.
With JMeter, you start by creating a test plan and, within it, a thread group. It's on the thread group that you specify the number of threads, the ramp-up period, and the loop count.
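Under the hood, the test plan is saved as a .jmx file, and the thread group is just an XML element within it. A trimmed sketch of what that element looks like, assuming 50 threads, a ten-second ramp-up, and five loops (the test name is a placeholder, and the surrounding hashTree wrappers are omitted--JMeter generates all of this for you through the GUI):

```xml
<ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup" testname="Exam Load" enabled="true">
  <!-- 50 virtual users -->
  <stringProp name="ThreadGroup.num_threads">50</stringProp>
  <!-- start all 50 users over a 10-second ramp-up -->
  <stringProp name="ThreadGroup.ramp_time">10</stringProp>
  <elementProp name="ThreadGroup.main_controller" elementType="LoopController"
               guiclass="LoopControlPanel" testclass="LoopController">
    <boolProp name="LoopController.continue_forever">false</boolProp>
    <!-- each user runs through the samplers 5 times -->
    <stringProp name="LoopController.loops">5</stringProp>
  </elementProp>
</ThreadGroup>
```

You'll normally never edit this by hand, but knowing what the GUI writes is useful when you want to keep plans in version control or tweak a run without opening JMeter.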
For authentication purposes, you can add an HTTP Authorization Manager to your thread group. This allows you to enter the base login URL, along with a username and password. If your application uses Windows authentication (like our load test instance did), you can enter the domain as well. In addition to the HTTP Authorization Manager, we added an HTTP Cache Manager to the thread group, configured to clear the cache on each iteration.
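In the saved plan, those two managers are plain XML as well. A trimmed sketch with a placeholder host, credentials, and domain (the property names are the ones JMeter writes; the hashTree wrappers are again omitted):

```xml
<AuthManager guiclass="AuthPanel" testclass="AuthManager" testname="HTTP Authorization Manager" enabled="true">
  <collectionProp name="AuthManager.auth_list">
    <elementProp name="" elementType="Authorization">
      <!-- base URL these credentials apply to (placeholder value) -->
      <stringProp name="Authorization.url">http://examapp.example.edu/</stringProp>
      <stringProp name="Authorization.username">loadtestuser</stringProp>
      <stringProp name="Authorization.password">secret</stringProp>
      <!-- Windows domain for NTLM authentication (placeholder value) -->
      <stringProp name="Authorization.domain">EXAMPLEDOMAIN</stringProp>
    </elementProp>
  </collectionProp>
</AuthManager>

<CacheManager guiclass="CacheManagerGui" testclass="CacheManager" testname="HTTP Cache Manager" enabled="true">
  <!-- start each loop iteration with an empty cache -->
  <boolProp name="clearEachIteration">true</boolProp>
</CacheManager>
```

Clearing the cache each iteration matters for load testing: without it, repeat page hits would be served from the virtual user's cache and understate the load on the server.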
Once you have these managers set up, you can start adding HTTP Request Samplers. Each sampler is a page that the virtual users will hit when the thread group spins up. These samplers are flexible since you can add parameters, form data, and even file upload data. You can get very close to emulating the actual page request/response lifecycle by using the options on this page.
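A sampler for one of the exam pages might be saved along these lines--a trimmed sketch with a hypothetical host, path, and form parameter:

```xml
<HTTPSamplerProxy guiclass="HttpTestSampleGui" testclass="HTTPSamplerProxy" testname="Exam Question Page" enabled="true">
  <stringProp name="HTTPSampler.domain">examapp.example.edu</stringProp>
  <stringProp name="HTTPSampler.path">/exam/question</stringProp>
  <stringProp name="HTTPSampler.method">POST</stringProp>
  <!-- form data sent with the request (one placeholder parameter) -->
  <elementProp name="HTTPsampler.Arguments" elementType="Arguments"
               guiclass="HTTPArgumentsPanel" testclass="Arguments">
    <collectionProp name="Arguments.arguments">
      <elementProp name="questionId" elementType="HTTPArgument">
        <stringProp name="Argument.name">questionId</stringProp>
        <stringProp name="Argument.value">42</stringProp>
        <boolProp name="HTTPArgument.use_equals">true</boolProp>
      </elementProp>
    </collectionProp>
  </collectionProp>
</HTTPSamplerProxy>
```

Each page you want in the run gets its own sampler like this under the thread group, and the virtual users hit them in order.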
Of course, what good is a load test without reporting? JMeter has several listeners: summary reporting of response times in milliseconds, aggregate reporting, and a results tree (which lets you inspect the result of each individual request). JMeter also offers graphing, such as of response times, so you can see a visual representation of the application's performance.
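Listeners are stored in the plan as ResultCollector elements, where the guiclass attribute determines which view you get. A heavily trimmed sketch of a summary report listener that also writes the raw results to a .jtl file (the full element carries a sample-save configuration block, omitted here):

```xml
<ResultCollector guiclass="SummaryReport" testclass="ResultCollector" testname="Summary Report" enabled="true">
  <boolProp name="ResultCollector.error_logging">false</boolProp>
  <!-- raw results file, useful for offline analysis (placeholder name) -->
  <stringProp name="filename">results.jtl</stringProp>
</ResultCollector>
```

Writing results to a file is worth doing even if you mostly watch the GUI, since the .jtl output lets you re-analyze a run after the fact.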
Running the entire test consists of choosing the number of users, the number of pages hit in a session, and the number of loops (times to run through all the pages). This is all configuration, and you're encouraged to play with the settings to see what causes stress on the system.
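One way to make that configuration easy to vary between runs is JMeter's __P property function, which reads a JMeter property and falls back to a default. A sketch, assuming hypothetical property names users and loops substituted into the thread group settings:

```xml
<!-- picks up -Jusers / -Jloops from the command line, else the defaults -->
<stringProp name="ThreadGroup.num_threads">${__P(users,50)}</stringProp>
<stringProp name="LoopController.loops">${__P(loops,10)}</stringProp>
```

A non-GUI run such as `jmeter -n -t examplan.jmx -Jusers=75 -Jloops=20 -l results.jtl` then overrides both values, which makes it easy to script a series of runs at increasing load.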
In analyzing the results, we saw variable load times--some entire runs averaged page loads of under two seconds, while others averaged upwards of ten seconds. From running through various scenarios, the stress point looked to be around 50-75 concurrent users running through about 10-20 pages apiece. The application held steady at around 1,000-1,500 page transactions in under a minute.
Was this going to be sufficient? At that point in testing it was hard to tell. We knew at the very least the application would need to handle 156 students hitting the system over a 10-15 minute period. Without a test run with actual users (to analyze their behavior, such as how often they actually click around versus read), the stress on the system was still up in the air. However, at least we had metrics, so we could predict at what point of stress the system was likely to start experiencing issues. We knew the application would hold for several thousand transactions, but if the students were fast with their behavior, it was possible we would run up database connections and experience lag.
These metrics--although not definitive--allowed us to develop a game plan for potential issues. We did eventually experience issues, but I've saved those details for the launch postmortem I'm working on.